
Graph As Point Set

ICML 2024

Citations: 0 | Views: 36
Abstract
Graph is a fundamental data structure to model interconnections between entities. Set, on the contrary, stores independent elements. To learn graph representations, current Graph Neural Networks (GNNs) primarily use message passing to encode the interconnections. In contrast, this paper introduces a novel graph-to-set conversion method that bijectively transforms interconnected nodes into a set of independent points and then uses a set encoder to learn the graph representation. This conversion method holds dual significance. Firstly, it enables using set encoders to learn from graphs, thereby significantly expanding the design space of GNNs. Secondly, for Transformer, a specific set encoder, we provide a novel and principled approach to inject graph information losslessly, different from all the heuristic structural/positional encoding methods adopted in previous graph transformers. To demonstrate the effectiveness of our approach, we introduce Point Set Transformer (PST), a transformer architecture that accepts a point set converted from a graph as input. Theoretically, PST exhibits superior expressivity for both short-range substructure counting and long-range shortest path distance tasks compared to existing GNNs. Extensive experiments further validate PST's outstanding real-world performance. Besides Transformer, we also devise a Deepset-based set encoder, which achieves performance comparable to representative GNNs, affirming the versatility of our graph-to-set method.
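To make the graph-to-set idea in the abstract concrete, below is a minimal illustrative sketch: each node becomes an independent point whose coordinates carry the graph's connectivity, and a permutation-invariant set encoder reads the resulting point set. The Laplacian-based decomposition and the toy DeepSets-style readout here are assumptions chosen for illustration only; they are not the paper's exact conversion or the Point Set Transformer architecture.

```python
# Illustrative sketch of "graph -> point set -> set encoder" (not the paper's method).
import numpy as np


def graph_to_points(adj: np.ndarray) -> np.ndarray:
    """Map an undirected graph (symmetric adjacency matrix) to one point per node.

    The graph Laplacian L = D - A is positive semidefinite, so L = Q Q^T with
    Q = U * sqrt(lam) from the eigendecomposition L = U diag(lam) U^T.
    Node i gets the point Q[i]; pairwise inner products of the points
    reproduce L exactly, so connectivity is retained (up to an orthogonal
    transform of the coordinate frame) rather than discarded.
    """
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj
    lam, U = np.linalg.eigh(lap)
    lam = np.clip(lam, 0.0, None)      # guard tiny negative values from roundoff
    return U * np.sqrt(lam)            # row i = coordinates of node i


def deepsets_readout(points: np.ndarray, hidden: int = 16, seed: int = 0) -> np.ndarray:
    """Toy DeepSets-style encoder: a per-point transform followed by sum pooling.

    Random weights only; this stands in for any learned set encoder
    (the paper uses a Transformer-based PST and a Deepset-based variant).
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((points.shape[1], hidden))
    phi = np.tanh(points @ w)          # applied independently to each point
    return phi.sum(axis=0)             # permutation-invariant pooling


if __name__ == "__main__":
    # 4-cycle graph: 0-1-2-3-0
    A = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)
    P = graph_to_points(A)
    print("graph embedding:", deepsets_readout(P))
```

The sketch only shows why a set of points can stand in for a graph: the decomposition is invertible given the coordinate frame, so no structural information is lost before the set encoder is applied.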