Predicting Target Language CCG Supertags Improves Neural Machine Translation

WMT, pp. 68-79, 2017.


Abstract:

Neural machine translation (NMT) models are able to partially learn syntactic information from sequential lexical information. Still, some complex syntactic phenomena, such as prepositional phrase attachment, are poorly modeled. This work aims to answer two questions: 1) Does explicitly modeling target language syntax help NMT? 2) Is tight integration of words and syntax better than multitask training?
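
The "tight integration" the abstract refers to is the paper's strategy of interleaving target-side CCG supertags with the word sequence, so the decoder predicts a syntactic tag before each word; the multitask alternative instead predicts the supertag sequence as a separate output. The snippet below is a minimal, illustrative sketch of what such an interleaved target sequence could look like; the helper function and the specific supertag labels are hypothetical examples for exposition, not code or data released with the paper.

```python
# Illustrative sketch: build an interleaved target sequence in which a CCG
# supertag precedes each word. The supertags here are hand-picked examples;
# in the paper's setting they would come from a CCG supertagger.

def interleave_supertags(words, supertags):
    """Return a single target sequence alternating supertag and word tokens."""
    assert len(words) == len(supertags)
    sequence = []
    for tag, word in zip(supertags, words):
        sequence.append(tag)   # syntactic token predicted first
        sequence.append(word)  # then the lexical token
    return sequence

words = ["We", "saw", "the", "man", "with", "a", "telescope"]
supertags = ["NP", "(S\\NP)/NP", "NP/N", "N", "(NP\\NP)/NP", "NP/N", "N"]

print(" ".join(interleave_supertags(words, supertags)))
# NP We (S\NP)/NP saw NP/N the N man (NP\NP)/NP with NP/N a N telescope
```

In this example the supertag of "with" makes the prepositional phrase attachment explicit (here, noun attachment), which is exactly the kind of syntactic decision the abstract notes is poorly modeled by purely lexical sequences.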
