Pre-training Multilingual Neural Machine Translation by Leveraging Alignment Information
EMNLP 2020, pp. 2649-2663.
We propose a multilingual neural machine translation pre-training model.
We investigate the following question for machine translation (MT): can we develop a single universal MT model to serve as the common seed and obtain derivative and improved models on arbitrary language pairs? We propose mRASP, an approach to pre-train a universal multilingual neural machine translation model. Our key idea in mRASP is its...