EGAT: Edge-Featured Graph Attention Network

arXiv (2021)

Cited 15 | Viewed 37
Abstract
Most state-of-the-art Graph Neural Networks focus on node features during learning and ignore edge features. However, edge features also carry essential information in real-world graphs, such as financial graphs. Node-centric approaches are suboptimal on edge-sensitive graphs because edge features are not adequately utilized. To address this problem, we present the Edge-Featured Graph Attention Network (EGAT), which leverages edge features in the graph feature representation. Our model is built on an edge-integrated attention mechanism, in which both node and edge features enter the computation of messages and attention weights. Moreover, the importance of edge information suggests that edge features should themselves be updated to learn high-level representations, so we update each edge by integrating the features of its connected nodes. In contrast to edge-node switching, our model gathers adjacent edge features with a node-transit strategy, avoiding a significant increase in computational complexity. We then employ a multi-scale merge strategy that concatenates the features of every layer to construct a hierarchical representation. Finally, our model can be adapted to domain-specific graph neural networks, further extending its application scenarios. Experiments show that our model achieves or matches the state of the art on both node-sensitive and edge-sensitive datasets.
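The abstract's three ingredients (attention scores computed from node *and* edge features, edge updates that integrate the connected nodes' features, and a multi-scale merge that concatenates per-layer outputs) can be sketched as a minimal NumPy layer. All parameter names, weight shapes, and the exact message/update formulas below are illustrative assumptions, not the paper's precise formulation:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class EGATLayer:
    """Sketch of an edge-integrated attention layer (hypothetical details)."""

    def __init__(self, node_dim, edge_dim, out_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.Wn = rng.standard_normal((node_dim, out_dim)) * 0.1  # node projection
        self.We = rng.standard_normal((edge_dim, out_dim)) * 0.1  # edge projection
        self.a = rng.standard_normal(3 * out_dim) * 0.1  # scores [h_i || h_j || e_ij]

    def forward(self, X, E, edges):
        """X: (N, node_dim) node feats; E: (M, edge_dim) edge feats;
        edges: list of (src, dst) pairs."""
        H = X @ self.Wn
        F = E @ self.We
        H_out = np.zeros_like(H)
        E_out = np.zeros_like(F)
        for i in range(H.shape[0]):
            idx = [k for k, (_, t) in enumerate(edges) if t == i]  # incoming edges
            if not idx:
                H_out[i] = H[i]
                continue
            # attention score from target node, source node, AND edge features
            scores = np.array([
                self.a @ np.concatenate([H[i], H[edges[k][0]], F[k]])
                for k in idx
            ])
            alpha = softmax(np.maximum(scores, 0.2 * scores))  # LeakyReLU + softmax
            # message also mixes in the edge feature, not just the neighbor node
            msgs = np.array([H[edges[k][0]] + F[k] for k in idx])
            H_out[i] = alpha @ msgs
        # edge update: integrate the features of the two connected nodes
        for k, (s, t) in enumerate(edges):
            E_out[k] = F[k] + 0.5 * (H_out[s] + H_out[t])
        return H_out, E_out

# toy graph: 3 nodes, 2 directed edges
layer1 = EGATLayer(4, 3, 8)
X = np.ones((3, 4))
E = np.ones((2, 3))
edges = [(0, 1), (1, 2)]
H1, E1 = layer1.forward(X, E, edges)
# multi-scale merge: concatenate each layer's node features hierarchically
layer2 = EGATLayer(8, 8, 8, seed=1)
H2, E2 = layer2.forward(H1, E1, edges)
merged = np.concatenate([H1, H2], axis=1)  # (3, 16) hierarchical representation
```

The key departure from vanilla GAT is that the edge feature `F[k]` appears in both the attention score and the message, and is itself updated from the endpoint nodes, so edges learn representations alongside nodes without the quadratic blow-up of building an explicit line graph.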
Keywords
Graph neural network, Edge feature, Attention