Talking-Heads Attention
arXiv (2020)
Abstract
We introduce "talking-heads attention" - a variation on multi-head attention which includes linear projections across the attention-heads dimension, immediately before and after the softmax operation. While inserting only a small number of additional parameters and a moderate amount of additional computation, talking-heads attention leads to better perplexities on masked language modeling tasks, as well as better quality when transfer-learning to language comprehension and question answering tasks.
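The abstract describes the mechanism only at a high level: attention logits are mixed across heads by a learned projection just before the softmax, and the resulting attention weights are mixed across heads again just after it. The sketch below is a minimal NumPy illustration of that idea under simplifying assumptions (a single head count shared by logits, softmax, and weights, and square mixing matrices); it is not the paper's exact formulation, which allows different numbers of heads at each stage.

```python
# Minimal sketch of talking-heads attention (assumed shapes, not the paper's code).
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def talking_heads_attention(q, k, v, proj_logits, proj_weights):
    """
    q: [batch, heads, q_len, d_k]
    k: [batch, heads, k_len, d_k]
    v: [batch, heads, k_len, d_v]
    proj_logits:  [heads, heads]  -- mixes heads immediately before the softmax
    proj_weights: [heads, heads]  -- mixes heads immediately after the softmax
    """
    d_k = q.shape[-1]
    # Standard scaled dot-product logits, computed per head.
    logits = np.einsum("bhqd,bhkd->bhqk", q, k) / np.sqrt(d_k)
    # Talking-heads step 1: linear projection across the heads dimension before softmax.
    logits = np.einsum("bhqk,hg->bgqk", logits, proj_logits)
    weights = softmax(logits, axis=-1)
    # Talking-heads step 2: another projection across heads after softmax.
    weights = np.einsum("bhqk,hg->bgqk", weights, proj_weights)
    # Weighted sum of values, as in standard multi-head attention.
    return np.einsum("bhqk,bhkd->bhqd", weights, v)

# Toy usage: batch=2, heads=4, q_len=5, k_len=6, d_k=d_v=8 (illustrative values).
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 4, 5, 8))
k = rng.normal(size=(2, 4, 6, 8))
v = rng.normal(size=(2, 4, 6, 8))
P_l = rng.normal(size=(4, 4)) / np.sqrt(4)
P_w = rng.normal(size=(4, 4)) / np.sqrt(4)
out = talking_heads_attention(q, k, v, P_l, P_w)
print(out.shape)  # (2, 4, 5, 8)
```

The two heads-mixing matrices are the only parameters added relative to standard multi-head attention, which is consistent with the abstract's claim of a small parameter and moderate compute overhead.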
Keywords
attention, talking-heads