Multilingual Translation with Extensible Multilingual Pretraining and Finetuning
Abstract:
Recent work demonstrates the potential of multilingual pretraining to create one model that can be used for various tasks in different languages. Previous work in multilingual pretraining has demonstrated that machine translation systems can be created by finetuning on bitext. In this work, we show that multilingual translation models...