Improving Massively Multilingual Neural Machine Translation and Zero-Shot Translation

Biao Zhang
Philip Williams

ACL, pp. 1628-1639, 2020.

Summary:
We show that multilingual neural machine translation suffers from weak capacity, and propose to enhance it by deepening the Transformer and devising language-aware neural models.

Abstract:

Massively multilingual models for neural machine translation (NMT) are theoretically attractive, but often underperform bilingual models and deliver poor zero-shot translations. In this paper, we explore ways to improve them. We argue that multilingual NMT requires stronger modeling capacity to support language pairs with varying typolo...