OAG$_{\mathrm {know}}$ : Self-Supervised Learning for Linking Knowledge Graphs

2023

Abstract
We propose a self-supervised embedding learning framework—SelfLinKG—to link concepts in heterogeneous knowledge graphs. Without any labeled data, SelfLinKG achieves competitive performance against its supervised counterpart and significantly outperforms state-of-the-art unsupervised methods by 26%–50% under the linear classification protocol. The essential components of SelfLinKG are local attention-based encoding and momentum contrastive learning. The former learns the graph representation using an attention network, while the latter learns a self-supervised model across knowledge graphs using contrastive learning. SelfLinKG has been deployed to build the new version of the Open Academic Graph (OAG), called OAG $_{\mathrm {know}}$. All data and code are publicly available.
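The momentum contrastive component described above follows the general MoCo-style pattern: a slowly updated key encoder provides stable targets, and an InfoNCE loss pulls a query toward its positive key and away from a queue of negatives. The sketch below is illustrative only, not the authors' implementation; the function names, the momentum value `m=0.999`, and the temperature `tau=0.07` are assumptions.

```python
import numpy as np

def momentum_update(q_params, k_params, m=0.999):
    """MoCo-style update: the key encoder trails the query encoder.
    (Sketch; m=0.999 is an assumed momentum coefficient.)"""
    return [m * k + (1 - m) * q for q, k in zip(q_params, k_params)]

def info_nce(query, pos_key, neg_keys, tau=0.07):
    """Contrastive (InfoNCE) loss for one query against a negative queue.
    (Sketch; tau=0.07 is an assumed temperature.)"""
    # L2-normalize all embeddings so dot products are cosine similarities.
    query = query / np.linalg.norm(query)
    pos_key = pos_key / np.linalg.norm(pos_key)
    neg_keys = neg_keys / np.linalg.norm(neg_keys, axis=1, keepdims=True)
    # Positive logit first, then negatives; scale by temperature.
    logits = np.concatenate([[query @ pos_key], neg_keys @ query]) / tau
    logits -= logits.max()  # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])  # cross-entropy with the positive at index 0
```

A query whose embedding matches its positive key yields a low loss, while a query closer to a negative yields a high loss, which is the signal that trains the encoder without labels.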
Keywords
Concept linking, self-supervised learning, contrastive learning, knowledge base