Multilingual Translation with Extensible Multilingual Pretraining and Finetuning


Abstract:

Recent work demonstrates the potential of multilingual pretraining for creating one model that can be used for various tasks in different languages. Previous work in multilingual pretraining has demonstrated that machine translation systems can be created by finetuning on bitext. In this work, we show that multilingual translation models...
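
The abstract is cut off on this page, but the kind of system it describes (a multilingual pretrained model finetuned into a single many-to-many translation model) has a publicly released checkpoint. Below is a minimal sketch, not code from the paper, that loads such a checkpoint through the Hugging Face transformers MBart-50 API; the checkpoint name, language codes, and API calls are assumptions based on the public release rather than anything stated on this page.

```python
# Minimal sketch (not from this page): one multilingual model translating
# English -> German, assuming the publicly released mBART-50 many-to-many
# checkpoint and the Hugging Face `transformers` MBart-50 API.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_name = "facebook/mbart-large-50-many-to-many-mmt"  # assumed checkpoint name
model = MBartForConditionalGeneration.from_pretrained(model_name)
tokenizer = MBart50TokenizerFast.from_pretrained(model_name)

# A single finetuned model covers many directions: the source language is set on
# the tokenizer, and the target language is forced as the first generated token.
tokenizer.src_lang = "en_XX"
encoded = tokenizer("Multilingual finetuning creates one model for many directions.",
                    return_tensors="pt")
generated = model.generate(
    **encoded,
    forced_bos_token_id=tokenizer.lang_code_to_id["de_DE"],
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```

The same loaded model serves other translation directions by changing `tokenizer.src_lang` and the forced target language code, which is the practical point of finetuning one pretrained model on many directions at once.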
