An Attention Mechanism for Neural Answer Selection Using a Combined Global and Local View

CoRR (2017)

Citations: 10 | Views: 11
Abstract
We propose a new attention mechanism for neural-network-based question answering, which depends on varying granularities of the input. Previous work augmented recurrent neural networks for question answering with simple attention mechanisms that are a function of the similarity between a question embedding and an answer embedding across time. We extend this by making the attention mechanism also dependent on a global embedding of the answer obtained using a separate network. We evaluate our system on InsuranceQA, a large question answering dataset. Our model outperforms current state-of-the-art results on InsuranceQA. Further, we examine which sections of text our attention mechanism focuses on, and explore its performance across different parameter settings.
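The mechanism the abstract describes can be sketched as additive attention over an answer's hidden states, conditioned on both the question embedding (the local view) and a global answer embedding produced by a separate network. The following numpy sketch is illustrative only: the function name, weight matrices, and scoring form (a tanh-bilinear additive score) are assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def global_local_attention(q, H, g, W_q, W_h, W_g, v):
    """Hypothetical attention combining a local and a global view.

    q: (d,)   question embedding
    H: (T, d) answer RNN hidden states over T time steps
    g: (d,)   global answer embedding from a separate encoder (assumption)
    Returns attention weights over time and the attended answer vector.
    """
    # score_t = v . tanh(W_q q + W_h h_t + W_g g) -- illustrative scoring form
    scores = np.array([v @ np.tanh(W_q @ q + W_h @ h + W_g @ g) for h in H])
    alpha = softmax(scores)       # attention weights, sum to 1
    return alpha, H.T @ alpha     # weighted combination of hidden states

# Toy usage with random parameters (dimensions are arbitrary).
rng = np.random.default_rng(0)
d, T = 8, 5
q, g = rng.normal(size=d), rng.normal(size=d)
H = rng.normal(size=(T, d))
W_q, W_h, W_g = (rng.normal(size=(d, d)) for _ in range(3))
v = rng.normal(size=d)
alpha, attended = global_local_attention(q, H, g, W_q, W_h, W_g, v)
```

Without the `W_g @ g` term this reduces to the simple question-answer attention the abstract attributes to prior work; the global term is what lets the weighting depend on the answer as a whole rather than only on per-step similarity.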
Keywords
Question Answering, Answer Selection, Neural Networks, Deep Learning, Recurrent Neural Networks, Attention Mechanism, Natural Language Processing