Pre-training Multilingual Neural Machine Translation by Leveraging Alignment Information

EMNLP 2020, pp. 2649-2663.

Abstract:

We investigate the following question for machine translation (MT): can we develop a single universal MT model to serve as the common seed and obtain derivative and improved models on arbitrary language pairs? We propose mRASP, an approach to pre-train a universal multilingual neural machine translation model. Our key idea in mRASP is its novel technique of random aligned substitution, which brings words and phrases with similar meanings across multiple languages closer in the representation space.
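
The substitution step described in the abstract can be sketched in a few lines: during pre-training, each source word is replaced, with some probability, by a dictionary-aligned translation in another language, so that words with similar meanings end up close together in the shared representation space. Below is a minimal sketch of that step, assuming a plain token list and a word-to-translations lexicon; the function name, lexicon format, and substitution probability are illustrative assumptions, not the paper's actual implementation.

```python
import random

def random_aligned_substitution(tokens, lexicon, prob=0.3, seed=None):
    # tokens:  list of source-language tokens
    # lexicon: maps a token to candidate translations in other languages,
    #          e.g. entries drawn from bilingual dictionaries such as MUSE
    # prob:    per-token substitution probability (illustrative value)
    rng = random.Random(seed)
    out = []
    for tok in tokens:
        candidates = lexicon.get(tok)
        if candidates and rng.random() < prob:
            out.append(rng.choice(candidates))  # swap in an aligned translation
        else:
            out.append(tok)                     # keep the original token
    return out

# Toy English-French lexicon; prob=1.0 forces every covered word to be swapped.
lexicon = {"hello": ["bonjour"], "world": ["monde"]}
print(random_aligned_substitution(["hello", "world", "!"], lexicon, prob=1.0, seed=0))
# -> ['bonjour', 'monde', '!']
```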