Knowledge Representation Learning Via Dynamic Relation Spaces

2016 IEEE 16th International Conference on Data Mining Workshops (ICDMW), 2016

Abstract
Knowledge graphs are an important component of the AI domain and contain large-scale structured knowledge, but they are far from complete. Previous translation models, such as TransE, TransH, TransR/CTransR, and TransD, use a relation vector to translate the head entity vector so that the result of the translation is close to the tail entity vector. Compared with other classical models, these translation models achieve state-of-the-art performance. In this paper, we propose a more flexible model named TransDR, which improves on TransD. In TransDR, we use two vectors to represent each entity and three vectors to represent each relation. Compared with TransD, TransDR adds another vector for each relation, which not only makes the model more flexible but also reduces noise from other relation spaces. In experiments, we evaluate our model on two typical tasks: triplets classification and link prediction. Experimental results show significant and consistent improvements over previous state-of-the-art models.
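The translation principle shared by the TransE family of models described above can be sketched as follows. This is a minimal illustration of the scoring idea (a relation vector translates the head entity embedding toward the tail entity embedding), not the paper's TransDR implementation; the toy vectors and the `transe_score` function name are assumptions for demonstration.

```python
import numpy as np

def transe_score(h: np.ndarray, r: np.ndarray, t: np.ndarray) -> float:
    """L2 dissimilarity ||h + r - t||: lower scores mean a more
    plausible (head, relation, tail) triple under the translation
    assumption h + r ~ t."""
    return float(np.linalg.norm(h + r - t))

# Toy embeddings (illustrative only, not learned parameters).
h = np.array([1.0, 0.0])
r = np.array([0.0, 1.0])
t_good = np.array([1.0, 1.0])   # satisfies h + r == t exactly
t_bad = np.array([-1.0, 0.5])   # violates the translation assumption

# A triple consistent with the translation assumption scores lower
# (better) than a mismatched one.
assert transe_score(h, r, t_good) < transe_score(h, r, t_bad)
```

Models such as TransR/TransD first project entity vectors into a relation-specific space before applying this score; TransDR's extra per-relation vector refines that projection.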
Keywords
knowledge representation learning,dynamic relation spaces,TransDR,triplets classification,link prediction,knowledge graphs