Alleviating the Inequality of Attention Heads for Neural Machine Translation

International Conference on Computational Linguistics (2022)

Abstract
Recent studies show that the attention heads in Transformer are not equal. We relate this phenomenon to the imbalanced training of multi-head attention and the model's dependence on specific heads. To tackle this problem, we propose a simple masking method, HeadMask, implemented in two specific variants. Experiments show that translation improvements are achieved on multiple language pairs. Subsequent empirical analyses also support our assumption and confirm the effectiveness of the method.
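The abstract does not spell out the two HeadMask variants, so the following is only a minimal PyTorch sketch of the general idea of masking attention heads during training so the model cannot over-rely on any single head. The class name `MaskedMultiheadSelfAttention`, the `mask_prob` parameter, and the survivor-rescaling scheme are illustrative assumptions, not the authors' exact method.

```python
import torch
import torch.nn as nn


class MaskedMultiheadSelfAttention(nn.Module):
    """Self-attention that randomly zeroes out whole heads during training.

    Sketch only: mask_prob and the rescaling of surviving heads are
    assumptions; the paper's two HeadMask variants may differ.
    """

    def __init__(self, d_model: int, n_heads: int, mask_prob: float = 0.1):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.head_dim = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out_proj = nn.Linear(d_model, d_model)
        self.mask_prob = mask_prob

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, s, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Reshape each projection to (batch, heads, seq, head_dim).
        q, k, v = (t.view(b, s, self.n_heads, self.head_dim).transpose(1, 2)
                   for t in (q, k, v))
        attn = torch.softmax(q @ k.transpose(-2, -1) / self.head_dim ** 0.5, dim=-1)
        heads = attn @ v  # (batch, heads, seq, head_dim)
        if self.training and self.mask_prob > 0:
            # Drop each head with probability mask_prob (shared across the batch),
            # then rescale survivors to keep the expected output scale unchanged.
            keep = (torch.rand(self.n_heads, device=x.device) >= self.mask_prob).float()
            keep = keep / keep.mean().clamp(min=1e-6)
            heads = heads * keep.view(1, -1, 1, 1)
        out = heads.transpose(1, 2).reshape(b, s, d)
        return self.out_proj(out)


# Usage: behaves like standard self-attention at eval time;
# heads are randomly masked only in training mode.
layer = MaskedMultiheadSelfAttention(d_model=512, n_heads=8, mask_prob=0.1)
layer.train()
y = layer(torch.randn(2, 10, 512))
print(y.shape)  # torch.Size([2, 10, 512])
```

The key point is that masking happens before the output projection, while the per-head structure is still explicit, so an entire head's contribution is removed rather than an arbitrary slice of the mixed output.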
Keywords
attention heads, neural machine translation