Improving Massively Multilingual Neural Machine Translation and Zero-Shot Translation
ACL 2020, pp. 1628–1639.
We show that multilingual neural machine translation suffers from weak modeling capacity, and propose to enhance it by deepening the Transformer and devising language-aware neural models.
Massively multilingual models for neural machine translation (NMT) are theoretically attractive, but often underperform bilingual models and deliver poor zero-shot translations. In this paper, we explore ways to improve them. We argue that multilingual NMT requires stronger modeling capacity to support language pairs with varying typological …