Non-Local Attention Learning On Large Heterogeneous Information Networks
2019 IEEE International Conference on Big Data (Big Data)
Abstract
Heterogeneous information networks (HINs) summarize rich structural information in real-world datasets and play an important role in many big data applications. Recently, graph neural networks have been extended to representation learning on HINs. One very recent advancement is the hierarchical attention mechanism, which incorporates both node-wise and semantic-wise attention. However, since a HIN is likely to be densely connected given its diverse types of edges, repeatedly applying graph convolutional layers can quickly make the node embeddings indistinguishable. To avoid such over-smoothing, existing graph neural networks targeting HINs generally adopt shallow structures and consequently ignore information beyond the local neighborhood. This design flaw violates the principle of non-local learning, which emphasizes capturing long-range dependencies. To address this limitation, we propose a novel framework of non-local attention in heterogeneous information networks (NLAH). Our framework utilizes a non-local attention structure to complement the hierarchical attention mechanism, thereby leveraging local and non-local information simultaneously. Moreover, a weighted sampling schema is designed for NLAH to reduce the computation cost on large-scale datasets. Extensive experiments on three real-world heterogeneous information networks illustrate that our framework exhibits strong scalability and outperforms state-of-the-art baselines by significant margins.
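The non-local idea described above can be illustrated with a minimal sketch: instead of aggregating information only from a node's immediate graph neighbors, attention weights are computed against every node, so long-range dependencies contribute to the embedding. The code below is an illustrative dot-product-attention sketch in NumPy, not the authors' exact NLAH layer; the function names and the toy embedding matrix are our own assumptions.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def non_local_attention(H, i):
    # Dot-product attention of node i against *every* node's embedding,
    # not just its graph neighbors -- the "non-local" aggregation idea.
    scores = H @ H[i]        # (N,) similarity of node i to all N nodes
    alpha = softmax(scores)  # attention weights summing to 1
    return alpha @ H         # (d,) attention-weighted global summary

rng = np.random.default_rng(0)
H = rng.normal(size=(6, 4))  # toy node embeddings: 6 nodes, dimension 4
z = non_local_attention(H, 2)
```

In the full framework this non-local summary would complement, not replace, the local hierarchical-attention output, and the weighted sampling schema mentioned above would restrict the candidate set on large graphs instead of attending over all N nodes.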
Keywords
semantic-wise attention, HIN, graph convolutional layers, graph neural networks, local neighborhood, heterogeneous information network, non-local attention structure, hierarchical attention mechanism, non-local information, structural information, representation learning, non-local attention learning, big data applications, node-wise attention, node embeddings, long-range dependencies, weighted sampling schema, NLAH