An Exploration of Neural Sequence-to-Sequence Architectures for Automatic Post-Editing

International Joint Conference on Natural Language Processing (IJCNLP), 2017.

Abstract:

In this work, we explore multiple neural architectures adapted for the task of automatic post-editing of machine translation output. We focus on neural end-to-end models that combine both inputs $mt$ (raw MT output) and $src$ (source language input) in a single neural architecture, modeling $\{mt, src\} \rightarrow pe$ directly. Apart from t…
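
The abstract describes end-to-end models that consume both $mt$ and $src$ in a single network to predict $pe$. As a rough illustration of that idea only, and not the paper's actual attention-based architectures, the following PyTorch-style sketch shows one way to combine two encoders with a single decoder; all class names, layer choices, and dimensions here are assumptions.

```python
import torch
import torch.nn as nn

class DualSourceSeq2Seq(nn.Module):
    """Hypothetical dual-input encoder-decoder: encodes mt and src
    separately, merges their final states, and decodes pe."""

    def __init__(self, mt_vocab, src_vocab, pe_vocab, emb=256, hid=512):
        super().__init__()
        self.mt_emb = nn.Embedding(mt_vocab, emb)
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.pe_emb = nn.Embedding(pe_vocab, emb)
        self.mt_enc = nn.GRU(emb, hid, batch_first=True)
        self.src_enc = nn.GRU(emb, hid, batch_first=True)
        self.bridge = nn.Linear(2 * hid, hid)  # merge both encoder states
        self.dec = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, pe_vocab)

    def forward(self, mt_ids, src_ids, pe_ids):
        _, h_mt = self.mt_enc(self.mt_emb(mt_ids))      # (1, B, hid)
        _, h_src = self.src_enc(self.src_emb(src_ids))  # (1, B, hid)
        # Initialize the decoder from both encoders' final states.
        h0 = torch.tanh(self.bridge(torch.cat([h_mt, h_src], dim=-1)))
        dec_out, _ = self.dec(self.pe_emb(pe_ids), h0)
        return self.out(dec_out)                        # logits over pe vocab
```

This sketch omits attention over the two encoders, which the paper identifies as central to its models; it only illustrates the basic $\{mt, src\} \rightarrow pe$ input/output structure.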
