Multilingual Denoising Pre-training for Neural Machine Translation.

Transactions of the Association for Computational Linguistics (2020)

Citations: 1501 | Views: 1177
Abstract
This paper demonstrates that multilingual denoising pre-training produces significant performance gains across a wide variety of machine translation (MT) tasks. We present mBART—a sequence-to-sequence…
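The abstract describes mBART as a sequence-to-sequence denoising auto-encoder: the model is trained to reconstruct text that has been corrupted by a noising function. A minimal sketch of one such noising scheme (BART-style text infilling, where random spans are replaced by a single mask token) is below; the function name and parameters are illustrative, and unlike mBART, which samples span lengths from a Poisson distribution, span lengths here are drawn uniformly.

```python
import random

MASK = "<mask>"

def text_infill(tokens, mask_ratio=0.35, max_span=5, seed=0):
    """Corrupt a token list by replacing random spans with a single MASK
    token (BART-style text infilling). Illustrative sketch only."""
    rng = random.Random(seed)
    n = len(tokens)
    budget = max(1, int(n * mask_ratio))   # target number of corrupted tokens
    out = list(tokens)
    masked = 0
    for _ in range(10 * n):                # bounded number of attempts
        if masked >= budget:
            break
        start = rng.randrange(n)
        end = min(start + rng.randint(1, max_span), n)
        # skip spans that overlap an already-masked region
        if any(out[i] is None or out[i] == MASK for i in range(start, end)):
            continue
        out[start] = MASK                  # one MASK replaces the whole span
        for i in range(start + 1, end):
            out[i] = None                  # mark remainder of span for deletion
        masked += end - start
    return [t for t in out if t is not None]
```

During pre-training, the corrupted sequence is fed to the encoder and the decoder is trained to reproduce the original, uncorrupted sequence.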
Keywords
neural machine translation, pre-training