Overcoming Catastrophic Forgetting During Domain Adaptation of Neural Machine Translation
North American Chapter of the Association for Computational Linguistics (2019)
Abstract
Continued training is an effective method for domain adaptation in neural machine translation. However, in-domain gains from adaptation come at the expense of general-domain performance. In this work, we interpret the drop in general-domain performance as catastrophic forgetting of general-domain knowledge. To mitigate it, we adapt Elastic Weight Consolidation (EWC), a machine learning method for learning a new task without forgetting previous tasks. Our method retains the majority of general-domain performance lost in continued training without degrading in-domain performance, outperforming the previous state-of-the-art. We also explore the full range of general-domain performance available when some in-domain degradation is acceptable.
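For reference, the standard EWC objective (Kirkpatrick et al., 2017) adds a quadratic penalty to the new-task loss, anchoring each parameter to its general-domain value in proportion to its estimated (diagonal) Fisher information. Below is a minimal PyTorch-style sketch of that penalty; it illustrates the technique the abstract names, not necessarily this paper's exact adaptation, and the names `fisher_diag`, `theta_star`, and `lambda_ewc` are illustrative.

```python
import torch

def ewc_penalty(model, fisher_diag, theta_star, lambda_ewc=1.0):
    """Standard EWC quadratic penalty (Kirkpatrick et al., 2017).

    fisher_diag and theta_star map parameter names to tensors:
    the diagonal Fisher information and the general-domain
    parameter values, both estimated before continued training.
    (Illustrative names; not from the paper.)
    """
    penalty = torch.zeros(())
    for name, param in model.named_parameters():
        # Penalize movement away from the general-domain optimum,
        # weighted by how important each parameter was there.
        penalty = penalty + (fisher_diag[name] * (param - theta_star[name]) ** 2).sum()
    return 0.5 * lambda_ewc * penalty

# During in-domain continued training, the total loss would be:
#   loss = nmt_loss + ewc_penalty(model, fisher_diag, theta_star, lambda_ewc)
```

The hyperparameter `lambda_ewc` trades in-domain gains against general-domain retention, which corresponds to the range of behavior the abstract says the authors explore.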