Constant-Time Machine Translation with Conditional Masked Language Models

International Joint Conference on Natural Language Processing, 2019.


Abstract:

Most machine translation systems generate text autoregressively, by sequentially predicting tokens from left to right. We instead use a masked language modeling objective to train a model to predict any subset of the target words, conditioned on both the input text and a partially masked target translation. This approach allows for efficient…
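
Example:

A model trained this way can decode in parallel by iterative refinement: predict every target word at once, then repeatedly re-mask and re-predict the positions it is least confident about, for a fixed number of iterations. The snippet below is a minimal Python sketch of one such loop, not the paper's implementation; the `cmlm` callable, the MASK sentinel, and the linear confidence-based re-masking schedule are assumptions made for illustration.

```python
import numpy as np

MASK = -1  # illustrative sentinel for the [MASK] token (not a real vocabulary id)


def mask_predict(cmlm, src_tokens, tgt_len, iterations=10):
    """Constant-iteration parallel decoding sketch.

    `cmlm(src_tokens, tgt_tokens)` is assumed to return an array of shape
    (tgt_len, vocab_size) with per-position probabilities over the vocabulary,
    conditioned on the source and a partially masked target.
    """
    tgt = np.full(tgt_len, MASK, dtype=np.int64)
    confidence = np.zeros(tgt_len)

    for t in range(iterations):
        if t > 0:
            # Re-mask the least confident positions; the count decays linearly
            # from (almost) the full target length toward a small fraction of it.
            num_to_mask = int(tgt_len * (iterations - t) / iterations)
            remask = np.argsort(confidence)[:num_to_mask]
            tgt[remask] = MASK

        masked = tgt == MASK
        probs = cmlm(src_tokens, tgt)  # (tgt_len, vocab_size)
        # Predict only the masked positions; unmasked tokens and their
        # confidences are kept from earlier iterations.
        tgt[masked] = probs[masked].argmax(axis=-1)
        confidence[masked] = probs[masked].max(axis=-1)

    return tgt


if __name__ == "__main__":
    # Toy usage with a dummy "model" that returns random probabilities,
    # just to show the call pattern and the output shape.
    rng = np.random.default_rng(0)

    def dummy_cmlm(src, tgt, vocab_size=32):
        scores = rng.random((len(tgt), vocab_size))
        return scores / scores.sum(axis=-1, keepdims=True)

    print(mask_predict(dummy_cmlm, src_tokens=np.array([5, 7, 9]), tgt_len=6))
```

Because the number of refinement passes is fixed rather than growing with the output length, decoding cost stays constant in the number of iterations, which is the source of the "constant-time" framing in the title.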
