Exphormer: Scaling Graph Transformers with Expander Graphs

ICLR 2023

Abstract
Graph transformers have emerged as a promising architecture for a variety of graph learning and representation tasks. Despite their successes, it remains challenging to scale graph transformers to large graphs while maintaining accuracy competitive with message-passing networks. In this paper, we introduce Exphormer, a framework for building powerful and scalable graph transformers. Exphormer consists of a sparse attention mechanism based on expander graphs, whose mathematical characteristics, such as spectral expansion and sparsity, yield graph transformers with complexity only linear in the size of the graph, while allowing us to prove desirable theoretical properties of the resulting transformer models. We show that incorporating Exphormer into the recently proposed GraphGPS framework produces models with competitive empirical results on a wide variety of graph datasets, including state-of-the-art results on three datasets. We also show that Exphormer can scale to datasets with larger graphs than those handled by previous graph transformer architectures.
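The core mechanism the abstract describes, attention restricted to a sparse expander-augmented edge set, can be illustrated with a minimal sketch. The code below is an illustration under stated assumptions, not the authors' implementation: the helper names (`random_regular_expander`, `expander_sparse_attention`) are hypothetical, and a random d-regular graph stands in for the expander (random regular graphs are spectral expanders with high probability). Exphormer's actual attention pattern combines the input graph's own edges, expander edges, and edges to global virtual nodes.

```python
# Minimal sketch (not the authors' code) of sparse attention over an
# expander-augmented edge set, in the spirit of Exphormer. Helper names
# and the random-regular-graph expander are assumptions for illustration.
import torch

def random_regular_expander(n: int, degree: int) -> torch.Tensor:
    """Union of `degree` random permutation matchings; a random regular
    graph is a spectral expander with high probability."""
    rows = torch.arange(n).repeat(degree)
    cols = torch.cat([torch.randperm(n) for _ in range(degree)])
    return torch.stack([rows, cols])  # shape (2, n * degree)

def expander_sparse_attention(x, edge_index, wq, wk, wv):
    """Softmax attention restricted to `edge_index`; cost is O(|E| * d)
    per layer instead of the O(n^2 * d) of dense attention."""
    n, d = x.shape
    q, k, v = x @ wq, x @ wk, x @ wv
    src, dst = edge_index
    # attention logits only along edges, never for all n^2 pairs
    scores = (q[dst] * k[src]).sum(-1) / d ** 0.5
    # numerically stable softmax over the incoming edges of each node
    smax = torch.full((n,), float("-inf")).scatter_reduce(
        0, dst, scores, reduce="amax")
    expw = (scores - smax[dst]).exp()
    denom = torch.zeros(n).scatter_add(0, dst, expw)
    alpha = expw / denom[dst].clamp_min(1e-12)
    # weighted aggregation of values along edges
    return torch.zeros(n, d).index_add_(0, dst, alpha.unsqueeze(-1) * v[src])

# Toy usage: input-graph edges + expander overlay + self-loops.
n, d = 100, 16
x = torch.randn(n, d)
graph_edges = torch.randint(0, n, (2, 300))       # stand-in input graph
exp_edges = random_regular_expander(n, degree=4)  # sparse expander overlay
loops = torch.arange(n).repeat(2, 1)              # self-loops
edge_index = torch.cat([graph_edges, exp_edges, loops], dim=1)
wq, wk, wv = (torch.randn(d, d) / d ** 0.5 for _ in range(3))
out = expander_sparse_attention(x, edge_index, wq, wk, wv)
print(out.shape)  # torch.Size([100, 16])
```

Because scores are computed only along the roughly n·d edges of the combined pattern, the per-layer cost grows linearly with the size of the graph, which is the scaling property the abstract claims.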
Keywords
Graph neural networks, Transformers