Leveraging Human Prior Knowledge To Learn Sense Representations

ECAI 2020: 24th European Conference on Artificial Intelligence (2020)

Abstract
Conventional distributed word representation learning, which learns a single vector for each word, cannot represent the different meanings of polysemous words. To address this issue, a number of approaches have been proposed in recent years to model individual word senses. However, most of these sense representations are difficult to integrate into downstream tasks. In this paper, we propose a knowledge-based method to learn word sense representations that offer effective support in downstream tasks. More specifically, we capture the semantic information of prior human knowledge from sememes, the minimum semantic units, to build global sense context vectors and perform reliable soft word sense disambiguation for polysemous words. We extend the Skip-gram framework with a contextual attention mechanism to learn an individual embedding for each sense. Intrinsic experimental results show that our proposed method captures the distinct and exact meanings of senses and outperforms previous work on the classic word similarity task. The extrinsic experiment and further analysis show that our sense embeddings can effectively improve performance and mitigate the impact of polysemy in multiple real-world downstream tasks.
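The abstract describes soft word sense disambiguation via attention between a local context vector and sememe-derived global sense context vectors. The following is a minimal sketch of that mechanism, not the authors' implementation: the sememe inventory, sense names, embedding dimensionality, and the choice to average sememe embeddings into a sense context vector are all illustrative assumptions for this example; in the paper the embeddings would be learned jointly under the Skip-gram objective.

```python
# Sketch of attention-based soft word sense disambiguation (assumed
# setup, not the authors' code): each sense of a polysemous word is
# annotated with sememes, a global sense context vector is built from
# the sense's sememe embeddings, and attention over senses is a
# softmax of similarities with the local context vector.
import numpy as np

rng = np.random.default_rng(0)
DIM = 50  # hypothetical embedding dimensionality

# Hypothetical sememe inventory with random embeddings (stand-ins for
# sememe embeddings that the real model would learn jointly).
sememe_emb = {s: rng.normal(size=DIM) for s in
              ["fruit", "food", "company", "institution", "product"]}

# Senses of the word "apple", each annotated with sememes
# (in the style of a HowNet-like knowledge base).
senses = {
    "apple#fruit":   ["fruit", "food"],
    "apple#company": ["company", "institution", "product"],
}

# Global sense context vectors: here, the mean of the sense's sememe
# embeddings (an assumed aggregation for illustration).
sense_context = {k: np.mean([sememe_emb[s] for s in ss], axis=0)
                 for k, ss in senses.items()}

# Per-sense embeddings to be learned; randomly initialised here.
sense_emb = {k: rng.normal(size=DIM) for k in senses}

def soft_wsd(context_vec):
    """Attention weights over senses given a local context vector."""
    keys = list(sense_context)
    scores = np.array([context_vec @ sense_context[k] for k in keys])
    scores -= scores.max()                       # numerical stability
    att = np.exp(scores) / np.exp(scores).sum()  # softmax
    return dict(zip(keys, att))

def attended_embedding(context_vec):
    """Soft mixture of sense embeddings, which would feed the
    Skip-gram objective in place of a single word vector."""
    att = soft_wsd(context_vec)
    return sum(att[k] * sense_emb[k] for k in att)

# Toy context: a local context vector near the "fruit" sememe, so the
# attention should favour the fruit sense.
context_vec = sememe_emb["fruit"] + 0.1 * rng.normal(size=DIM)
print({k: round(float(v), 3) for k, v in soft_wsd(context_vec).items()})
```

Because the disambiguation is soft, every sense embedding receives gradient in proportion to its attention weight, which avoids the hard-assignment errors of picking a single sense per occurrence.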