Two-Step Multi-Factor Attention Neural Network For Answer Selection

PRICAI 2018: Trends in Artificial Intelligence, Part I (2018)

Abstract
Recently proposed attention-based neural network models have achieved great success in the question answering task. They focus on introducing interaction information into sentence modeling rather than representing the question and the answer individually. However, previous work has some limitations. First, in the interaction layer, most attention mechanisms do not make full use of the diverse semantic information of the question. Second, they have limited capability to model the interaction from multiple aspects. In this paper, to address these two limitations, we propose a two-step multi-factor attention neural network model. The two-step strategy encodes the question into different representations conditioned on the individual words of the answer, and these representations are employed to build dynamic-question-aware attention. Additionally, a multi-factor mechanism is introduced to extract diverse interaction information, aggregating meaningful facts distributed across different matching results. Experimental results on three standard QA datasets show that our model outperforms various state-of-the-art systems.
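The mechanism described above can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' implementation): each of k "factors" is a bilinear matching between question and answer word representations, the factor scores are aggregated by an element-wise max, and a column-wise softmax yields, for every answer word, its own attention distribution over the question (the dynamic-question-aware representations). All function and variable names here are assumptions for illustration.

```python
import numpy as np

def multi_factor_attention(H_q, H_a, W):
    """Sketch of two-step multi-factor attention.

    H_q: (m, d) question word representations
    H_a: (n, d) answer word representations
    W:   (k, d, d) one bilinear weight matrix per matching factor
    Returns a (n, d) matrix: a question summary per answer word.
    """
    # One bilinear matching matrix per factor: M[f] = H_q @ W[f] @ H_a^T
    M = np.einsum('md,fde,ne->fmn', H_q, W, H_a)        # (k, m, n)
    # Aggregate the factors by element-wise max, keeping the
    # strongest matching evidence found by any factor
    S = M.max(axis=0)                                    # (m, n)
    # Column-wise softmax: for each answer word, an attention
    # distribution over the question words
    A = np.exp(S - S.max(axis=0, keepdims=True))
    A /= A.sum(axis=0, keepdims=True)                    # (m, n)
    # Dynamic question representation conditioned on each answer word
    return A.T @ H_q                                     # (n, d)

# Toy dimensions: 5 question words, 7 answer words, dim 8, 3 factors
rng = np.random.default_rng(0)
m, n, d, k = 5, 7, 8, 3
Q_dyn = multi_factor_attention(rng.normal(size=(m, d)),
                               rng.normal(size=(n, d)),
                               rng.normal(size=(k, d, d)))
print(Q_dyn.shape)  # (7, 8)
```

In a full model, each row of `Q_dyn` would be matched against the corresponding answer word representation before pooling into a sentence-level relevance score.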
Keywords
Answer selection, Neural network, Two-step attention, Multi-factor attention