A Relation-aware Attention Neural Network for Modeling the Usage of Scientific Online Resources

2021 International Joint Conference on Neural Networks (IJCNN), 2021

Abstract
More and more online resources for computer science have been introduced, used, and released in the scientific literature in recent years. Knowledge about the usage of these online resources can help researchers easily find resources applicable to their own work. However, most existing methods ignore the content of online resource citations. To this end, we manually create SciR, a dataset containing 3,012 annotated sentences for this task, and introduce a multi-task learning framework to automatically extract entities and relations from the context of online resource citations in scientific papers. Furthermore, considering that the words in a sentence usually play different roles under different relations, we treat each relation as a distinctive sub-space and model the correlations between the words in a sentence for each relation type with a supervised biaffine attention network. Based on this relation-aware attention network, our model not only effectively captures word-level correlations under each relation, but also naturally avoids the problem of overlapping relations. To evaluate the effectiveness of our model, we conduct comprehensive experiments on three datasets, and the results demonstrate that our model outperforms other state-of-the-art methods on both entity recognition and relation extraction.
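To make the relation-specific biaffine scoring concrete, the following is a minimal pure-Python sketch of how one relation sub-space can score every (head word, dependent word) pair in a sentence. This is an illustration under assumed shapes and names (H, U, W, b), not the authors' implementation: the paper's actual model learns one such biaffine layer per relation type on top of contextual encoders.

```python
def matvec(M, v):
    """Multiply matrix M (a list of rows) by vector v."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def dot(a, b):
    """Inner product of two equal-length vectors."""
    return sum(x * y for x, y in zip(a, b))

def biaffine_scores(H, U, W, b):
    """Score every word pair (i, j) under one relation sub-space.

    H : list of n word representations, each a d-dim list (assumed encoder output)
    U : d x d bilinear weight matrix for this relation type
    W : 2d-dim linear weight vector
    b : scalar bias
    Returns an n x n matrix with s[i][j] = H[i]^T U H[j] + W . [H[i]; H[j]] + b.
    """
    n = len(H)
    scores = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            bilinear = dot(H[i], matvec(U, H[j]))  # H[i]^T U H[j]
            linear = dot(W, H[i] + H[j])           # list '+' concatenates: [H[i]; H[j]]
            scores[i][j] = bilinear + linear + b
    return scores
```

Because each relation type owns its own (U, W, b), a single word pair can receive a high score under several relations at once, which is how a per-relation scoring scheme sidesteps the overlapping-relation problem.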
Keywords
relation extraction,entity extraction,SciR dataset,computer science,annotation sentences,word-level correlations,supervised biaffine attention network,multitask learning framework,online resource citations,scientific online resources,relation-aware attention neural network