Local structure-aware graph contrastive representation learning

Kai Yang, Yuan Liu, Zijuan Zhao, Peijin Ding, Wenqian Zhao

NEURAL NETWORKS (2024)

Abstract
Traditional Graph Neural Networks (GNNs), as graph representation learning methods, are constrained by label information. Graph Contrastive Learning (GCL) methods, which tackle the label problem effectively, mainly focus on the feature information of the global graph or on small subgraph structures (e.g., the first-order neighborhood). In this paper, we propose a Local Structure-aware Graph Contrastive representation Learning method (LS-GCL) to model the structural information of nodes from multiple views. Specifically, we construct semantic subgraphs that are not limited to the first-order neighbors. For the local view, the semantic subgraph of each target node is fed into a shared GNN encoder to obtain the target node embeddings at the subgraph level. Then, we use a pooling function to generate the subgraph-level graph embeddings. For the global view, considering that the original graph preserves indispensable semantic information about nodes, we leverage the shared GNN encoder to learn the target node embeddings at the global graph level. The proposed LS-GCL model is optimized to maximize the common information among similar instances across these three perspectives through a multi-level contrastive loss function. Experimental results on six datasets illustrate that our method outperforms state-of-the-art graph representation learning approaches on both node classification and link prediction tasks.
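The abstract does not give the loss in closed form, but the multi-level contrastive objective it describes is typically built from InfoNCE-style terms that pull embeddings of the same node from two views (e.g., subgraph-level and global graph-level) together while pushing other nodes apart. The sketch below is a minimal, hypothetical numpy illustration of one such pairwise term, not the authors' implementation; the variable names (`z_sub`, `z_glob`) and the temperature value are assumptions.

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """InfoNCE-style contrastive loss between two views of node embeddings.

    z1, z2: (n_nodes, dim) arrays; row i of each view is a positive pair,
    all other rows serve as negatives.
    """
    # L2-normalize so dot products become cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                       # (n, n) similarity matrix
    sim = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))          # positives sit on the diagonal

rng = np.random.default_rng(0)
z_sub = rng.normal(size=(8, 16))                  # subgraph-level view (toy data)
z_glob = z_sub + 0.1 * rng.normal(size=(8, 16))   # correlated global-level view
loss_pos = info_nce(z_sub, z_glob)
loss_rand = info_nce(z_sub, rng.normal(size=(8, 16)))
print(loss_pos < loss_rand)  # correlated views yield the lower loss
```

A multi-level objective in the spirit of LS-GCL would sum several such terms (node-to-node across views, node-to-subgraph-summary, etc.), with the two encoders sharing weights as the abstract states.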
Keywords
Graph representation learning,Graph neural network,Self-supervised learning,Graph contrastive learning