Augmenting Recurrent Graph Neural Networks with a Cache.

KDD 2023

Abstract
While graph neural networks (GNNs) provide a powerful way to learn structured representations, it remains challenging to learn long-range dependencies in graphs. Recurrent GNNs only partly address this problem. In this paper, we propose a general approach for augmenting recurrent GNNs with a cache memory to improve their expressivity, especially for modeling long-range dependencies. Specifically, we first introduce a method of augmenting recurrent GNNs with a cache of previous hidden states. We then propose a general Cache-GNN framework that adds further modules, including an attention mechanism and positional/structural encoders, to improve expressivity. We show that Cache-GNNs outperform other models on synthetic datasets as well as on real-world tasks that require long-range information.
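The abstract describes caching the hidden states from previous recurrent steps and attending over that cache when updating node representations. The sketch below illustrates the idea in PyTorch; the class name CachedRecurrentGNN, the GRU-based node update, the dense-adjacency aggregation, and parameters such as cache_size and num_steps are illustrative assumptions rather than the authors' implementation, and the positional/structural encoders mentioned in the abstract are omitted.

```python
# Minimal sketch: a recurrent GNN whose node update reads from a bounded cache
# of previous hidden states via attention. Names and hyperparameters here are
# illustrative assumptions, not the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CachedRecurrentGNN(nn.Module):
    def __init__(self, in_dim, hid_dim, cache_size=8):
        super().__init__()
        self.encoder = nn.Linear(in_dim, hid_dim)
        self.msg = nn.Linear(hid_dim, hid_dim)        # neighborhood message
        self.gru = nn.GRUCell(2 * hid_dim, hid_dim)   # recurrent node update
        self.q_proj = nn.Linear(hid_dim, hid_dim)     # attention query
        self.k_proj = nn.Linear(hid_dim, hid_dim)     # attention key
        self.v_proj = nn.Linear(hid_dim, hid_dim)     # attention value
        self.cache_size = cache_size

    def attend_cache(self, h, cache):
        # cache: [T, N, D] stack of previous hidden states; h: [N, D]
        q = self.q_proj(h).unsqueeze(0)               # [1, N, D]
        k = self.k_proj(cache)                        # [T, N, D]
        v = self.v_proj(cache)
        scores = (q * k).sum(-1) / k.size(-1) ** 0.5  # [T, N]
        w = F.softmax(scores, dim=0).unsqueeze(-1)    # attention over cached steps
        return (w * v).sum(0)                         # [N, D] read vector

    def forward(self, x, adj, num_steps=12):
        # x: [N, in_dim] node features; adj: [N, N] row-normalized adjacency
        h = torch.tanh(self.encoder(x))
        cache = [h]
        for _ in range(num_steps):
            agg = adj @ self.msg(h)                   # aggregate neighbor messages
            read = self.attend_cache(h, torch.stack(cache))
            h = self.gru(torch.cat([agg, read], dim=-1), h)
            cache.append(h)
            cache = cache[-self.cache_size:]          # keep only the most recent states
        return h


if __name__ == "__main__":
    # Toy usage on a random 5-node graph
    N, F_in = 5, 16
    x = torch.randn(N, F_in)
    adj = torch.eye(N) + torch.rand(N, N).round()
    adj = adj / adj.sum(-1, keepdim=True)             # row-normalize
    model = CachedRecurrentGNN(F_in, 32)
    print(model(x, adj).shape)                        # torch.Size([5, 32])
```

The design intent is that attention over cached states lets information from early recurrent steps reach the current update directly, rather than only through repeated neighborhood aggregation, which is how the paper motivates improved modeling of long-range dependencies.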
Keywords
graph neural networks, memory, attention