
Novel Multi‐domain Attention for Abstractive Summarisation

Chunxia Qu, Ling Lu, Aijuan Wang, Wu Yang, Yinong Chen

CAAI Transactions on Intelligence Technology (2022)

Abstract
Existing abstractive text summarisation models consider only the word-sequence correlations between the source document and the reference summary, so the generated summary often fails to cover the subject of the source document because of the models' narrow perspective. To address these shortcomings, a multi-domain attention pointer (MDA-Pointer) abstractive summarisation model is proposed in this work. First, the model uses bidirectional long short-term memory (Bi-LSTM) to encode the word and sentence sequences of the source document separately, obtaining semantic representations at the word and sentence levels. A multi-domain attention mechanism is then established between these semantic representations and the summary words, so that the model generates each summary word conditioned on both words and sentences. Next, words are drawn from the vocabulary or copied from the source word sequence through a pointer network to form the summary, and a coverage mechanism is introduced at both the word and sentence levels to reduce redundancy in the summary content. Finally, the model is evaluated on the CNN/Daily Mail dataset: ROUGE scores improve both without and with the coverage mechanism, and the results verify the effectiveness of the proposed model.
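As a rough illustration of the two-level attention the abstract describes, the PyTorch sketch below attends over word-level and sentence-level encoder states separately and fuses the two context vectors for a decoder step. All module names, dimensions, and the fusion scheme are assumptions for illustration only; the paper's actual equations are not reproduced on this page.

```python
# Minimal sketch of a word- and sentence-level ("multi-domain") attention
# fusion, assuming additive attention and a tanh fusion layer. Names such as
# MultiDomainAttention are hypothetical, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiDomainAttention(nn.Module):
    """Attend over word-level and sentence-level encoder states separately,
    then fuse the two context vectors for the decoder step."""
    def __init__(self, hidden: int):
        super().__init__()
        self.word_score = nn.Linear(2 * hidden, 1)  # additive scorer over words
        self.sent_score = nn.Linear(2 * hidden, 1)  # additive scorer over sentences
        self.fuse = nn.Linear(2 * hidden, hidden)   # merge the two contexts

    def forward(self, dec_state, word_enc, sent_enc):
        # dec_state: (B, H); word_enc: (B, Tw, H); sent_enc: (B, Ts, H)
        def attend(scorer, enc):
            q = dec_state.unsqueeze(1).expand(-1, enc.size(1), -1)
            scores = scorer(torch.cat([q, enc], dim=-1)).squeeze(-1)   # (B, T)
            weights = F.softmax(scores, dim=-1)
            context = torch.bmm(weights.unsqueeze(1), enc).squeeze(1)  # (B, H)
            return context, weights

        word_ctx, word_attn = attend(self.word_score, word_enc)
        sent_ctx, sent_attn = attend(self.sent_score, sent_enc)
        fused = torch.tanh(self.fuse(torch.cat([word_ctx, sent_ctx], dim=-1)))
        return fused, word_attn, sent_attn
```

In the full model, the word-level attention weights would presumably also serve as the copy distribution for the pointer network, and the accumulated attention weights at both levels would feed the coverage penalty; both details are omitted from this sketch.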
Keywords
abstractive summarisation,attention mechanism,Bi‐LSTM,coverage mechanism,pointer network,abstracting,recurrent neural nets,text analysis,word processing,convolutional neural nets