Incorporating Phrase-Level Agreement into Neural Machine Translation.

International Conference on Natural Language Processing (2020)

Abstract
Phrase information has been successfully integrated into current state-of-the-art neural machine translation (NMT) models. However, the natural alignment property between source and target phrases has not been explored. In this paper, we propose a novel phrase-level agreement method to deal with this problem. First, we explore n-gram models over minimal translation units (MTUs) to explicitly capture aligned bilingual phrases from the parallel corpora. Then, we propose a phrase-level agreement loss that directly reduces the difference between the representations of the source-side and target-side phrases. Finally, we integrate the phrase-level agreement loss into the NMT models to improve translation performance. Empirical results on the NIST Chinese-to-English and the WMT English-to-German translation tasks show that the proposed phrase-level agreement method achieves significant improvements over state-of-the-art baselines, demonstrating the effectiveness and necessity of exploiting phrase-level agreement for NMT.
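
To illustrate the idea of an agreement loss between aligned phrase representations, here is a minimal sketch in PyTorch. The abstract does not specify the pooling operation, distance measure, or weighting, so mean pooling, a squared-L2 distance, and the weight 0.1 are assumptions for illustration only, not the paper's exact formulation.

# Minimal sketch (assumptions noted above): a phrase-level agreement loss that
# penalizes the distance between pooled source- and target-side phrase
# representations, added to the standard NMT training loss.
import torch

def phrase_agreement_loss(src_phrase_states, tgt_phrase_states):
    # src_phrase_states, tgt_phrase_states: (num_phrases, phrase_len, d_model)
    # hidden states of aligned source/target phrases (e.g. extracted via MTUs).
    src_repr = src_phrase_states.mean(dim=1)   # pool each phrase: (num_phrases, d_model)
    tgt_repr = tgt_phrase_states.mean(dim=1)   # pool each phrase: (num_phrases, d_model)
    # Agreement term: mean squared distance between aligned phrase vectors.
    return ((src_repr - tgt_repr) ** 2).sum(dim=-1).mean()

if __name__ == "__main__":
    torch.manual_seed(0)
    src = torch.randn(4, 3, 8)        # 4 aligned phrases, 3 source tokens each, d_model = 8
    tgt = torch.randn(4, 5, 8)        # target phrases may have a different length
    nmt_loss = torch.tensor(2.5)      # stand-in for the usual cross-entropy loss
    total = nmt_loss + 0.1 * phrase_agreement_loss(src, tgt)
    print(total.item())

In practice the agreement term would be computed from the encoder and decoder hidden states of each aligned phrase pair during training and combined with the translation loss via a tunable weight.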
Keywords
neural machine translation, machine translation, phrase-level