
Enhancing knowledge graph embedding with relational constraints

Neurocomputing (2021)

Abstract
Knowledge graph embedding maps the entities and relations of a knowledge graph into continuous vector spaces, which benefits a variety of real-world applications. Among existing solutions, translational models, which employ geometric translation to design the score function, have drawn much attention. However, these models primarily rely on evidence about whether observed triplets are plausible, and ignore the fact that a relation also implies certain semantic constraints on its subject and object entities. In this paper, we present a general framework for enhancing knowledge graph embedding with relational constraints (KRC). Specifically, we carefully design the score function by encoding regularities between a relation and its arguments into the translational embedding space. Additionally, we propose a soft margin-based ranking loss for effectively training the KRC model, which characterizes different semantic distances between negative and positive triplets. Furthermore, we combine regularities with distributional representations to predict missing triplets, which provides a certain robustness guarantee. We evaluate our method on the tasks of knowledge graph completion and entity classification. Extensive experiments show that KRC achieves better or comparable performance against state-of-the-art methods. Moreover, KRC yields a substantial improvement on long-tail entities, which have few instances in the knowledge graph. (C) 2020 Elsevier B.V. All rights reserved.
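As a rough illustration of the translational setting the abstract refers to, the sketch below shows a TransE-style score function and a standard margin-based ranking loss in PyTorch. The exact KRC score function and its soft margin are not reproduced in the abstract, so the definitions below are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch of a translational score and a margin-based ranking loss,
# in the spirit of the translational models the paper builds on.
# Assumption: h, r, t are embedding tensors of shape (batch, dim).
import torch
import torch.nn.functional as F

def translational_score(h, r, t):
    # Plausibility score of a triplet (h, r, t): a smaller ||h + r - t||
    # means the triplet is considered more plausible.
    return torch.norm(h + r - t, p=2, dim=-1)

def margin_ranking_loss(pos_score, neg_score, margin=1.0):
    # Standard fixed-margin ranking loss over positive and negative triplets.
    # The paper's soft margin additionally adapts this gap to the semantic
    # distance between triplets; a constant margin is used here for brevity.
    return F.relu(margin + pos_score - neg_score).mean()
```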
Keywords
Knowledge graph embedding, Translational model, Relational constraints, Knowledge graph completion