Graph Convolutional Networks with EigenPooling

KDD, pp. 723–731, 2019.

Cited by: 22 | DOI: https://doi.org/10.1145/3292500.3330982
Other links: academic.microsoft.com | dblp.uni-trier.de | dl.acm.org | arxiv.org
We propose a pooling operator based on the local graph Fourier transform, which utilizes the subgraph structure as well as the node features to generate supernode representations.

Abstract:

Graph neural networks, which generalize deep neural network models to graph-structured data, have attracted increasing attention in recent years. They usually learn node representations by transforming, propagating and aggregating node features, and have been proven to improve the performance of many graph-related tasks such as node classification…

Introduction
  • Recent years have witnessed increasing interest in generalizing neural networks to graph-structured data.
  • In a graph of a protein, atoms are connected via bonds; some local structures, consisting of groups of atoms and their direct bonds, can represent specific functional units, which, in turn, are important for determining the functionality of the entire protein [3, 11, 37].
  • These local structures are not captured during a global summarizing process.
  • To generate a graph representation that preserves both local and global graph structure, a hierarchical pooling process, analogous to the pooling process in conventional convolutional neural networks (CNNs) [23], is needed.
Highlights
  • Recent years have witnessed increasing interest in generalizing neural networks to graph-structured data.
  • We aim to develop a graph neural network (GNN) model, consisting of convolutional layers and pooling layers, that learns graph representations to which graph-level classification can be applied.
  • We propose a pooling operator based on the local graph Fourier transform, which utilizes the subgraph structure as well as the node features to generate supernode representations.
  • We design EigenPooling, a pooling operator based on the local graph Fourier transform, which extracts subgraph information using both the node features and the structure of the subgraph (a minimal sketch follows this list).
  • The pooling operator, together with a subgraph-based graph coarsening method, forms the pooling layer, which can be incorporated into any graph neural network to hierarchically learn graph-level representations.
  • We further propose EigenGCN, a graph neural network framework combining the proposed pooling layers with GCN convolutional layers.
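To make the pooling operator concrete, here is a minimal NumPy sketch of the local-graph-Fourier-transform idea: it eigendecomposes the Laplacian of one subgraph and projects the subgraph's node features onto the smoothest eigenvectors to form a supernode representation. The function name `eigen_pool_subgraph`, the unnormalized Laplacian, and keeping only the `k` lowest-frequency modes are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def eigen_pool_subgraph(A_sub, X_sub, k=3):
    """Pool one subgraph into a single supernode feature vector.

    A_sub: (n, n) adjacency matrix of the subgraph.
    X_sub: (n, d) node feature matrix of the subgraph.
    k:     number of local Fourier modes (Laplacian eigenvectors) to keep.
    """
    n = A_sub.shape[0]
    L = np.diag(A_sub.sum(axis=1)) - A_sub   # combinatorial Laplacian of the subgraph
    _, eigvecs = np.linalg.eigh(L)           # eigenvectors, eigenvalues in ascending order
    U = eigvecs[:, :min(k, n)]               # smoothest (low-frequency) local Fourier basis
    coeffs = U.T @ X_sub                     # local graph Fourier transform of the features
    return coeffs.reshape(-1)                # supernode feature vector of length <= k*d

# Toy usage: a triangle subgraph with 2-dimensional node features.
A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])
X = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
print(eigen_pool_subgraph(A, X, k=2))        # -> vector of length 4
```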
Results
  • The authors' proposed framework achieves state-of-the-art performance on most of the datasets, demonstrating its effectiveness.
Conclusion
  • The authors design EigenPooling, a pooling operator based on the local graph Fourier transform, which extracts subgraph information using both the node features and the structure of the subgraph.
  • The pooling operator, together with a subgraph-based graph coarsening method, forms the pooling layer, which can be incorporated into any graph neural network to hierarchically learn graph-level representations.
  • The authors further propose EigenGCN, a graph neural network framework combining the proposed pooling layers with GCN convolutional layers (a sketch of this hierarchical pipeline follows this list).
  • The authors' proposed framework achieves state-of-the-art performance on most of the datasets, demonstrating its effectiveness.
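As a rough illustration of how such a hierarchical model fits together, the sketch below alternates a GCN convolution [22] with a pooling step driven by a precomputed hard cluster assignment. The assignment matrix S, the sum-pooling stand-in inside `pool`, and all function names are assumptions for illustration only; the paper's pooling layer instead uses the EigenPooling operator sketched earlier.

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph convolution with the renormalized propagation rule of [22], plus ReLU."""
    A_hat = A + np.eye(A.shape[0])                          # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))  # D~^{-1/2}
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W, 0.0)

def pool(A, X, S):
    """Coarsen the graph with a hard cluster assignment S of shape (n, m).

    Sum-pooling stand-in for illustration; the paper builds supernode
    features from local graph Fourier coefficients instead.
    """
    return S.T @ A @ S, S.T @ X                             # coarsened adjacency, pooled features

# Toy hierarchical forward pass on a 4-node path graph: conv -> pool -> conv -> readout.
rng = np.random.default_rng(0)
A = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
X = rng.normal(size=(4, 3))
S = np.array([[1., 0.],                                     # nodes 0, 1 -> supernode 0
              [1., 0.],
              [0., 1.],                                     # nodes 2, 3 -> supernode 1
              [0., 1.]])
H = gcn_layer(A, X, rng.normal(size=(3, 8)))
A2, H2 = pool(A, H, S)
H2 = gcn_layer(A2, H2, rng.normal(size=(8, 8)))
graph_repr = H2.sum(axis=0)                                 # graph-level representation, shape (8,)
```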
Tables
  • Table 1: Statistics of datasets
  • Table 2: Performance comparison
Related work
  • In recent years, graph neural network models, which extend deep neural network models to graph-structured data, have attracted increasing interest. These models have been applied in many different areas. In [22], a graph neural network that learns node representations by aggregating feature information from each node's neighbors is applied to semi-supervised node classification (its propagation rule is shown below). Similar methods were later proposed to further enhance performance by adding an attention mechanism [42]. GraphSAGE [18], which allows a more flexible aggregation procedure, was designed for the same task. Some graph neural network models are designed to reason about the dynamics of physical systems, where the model predicts future states of nodes given their previous states [1, 32].
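For reference, the layer-wise propagation rule of the GCN model [22] mentioned above can be written as

$$H^{(l+1)} = \sigma\big(\tilde{D}^{-1/2}\,\tilde{A}\,\tilde{D}^{-1/2}\,H^{(l)}\,W^{(l)}\big),$$

where $\tilde{A} = A + I$ is the adjacency matrix with added self-loops, $\tilde{D}$ its degree matrix, $H^{(l)}$ the node representations at layer $l$, and $W^{(l)}$ a learnable weight matrix; each node's new representation is thus a degree-normalized average of the transformed features of itself and its neighbors.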
Funding
  • Yao Ma and Jiliang Tang are supported by the National Science Foundation (NSF) under grant numbers IIS-1714741, IIS-1715940, IIS-1845081 and CNS-1815636, and by a Criteo Faculty Research Award.
References
  • [1] Peter Battaglia, Razvan Pascanu, Matthew Lai, Danilo Jimenez Rezende, et al. 2016. Interaction networks for learning about objects, relations and physics. In NIPS. 4502–4510.
  • [2] Peter W Battaglia, Jessica B Hamrick, Victor Bapst, Alvaro Sanchez-Gonzalez, Vinicius Zambaldi, Mateusz Malinowski, Andrea Tacchetti, David Raposo, Adam Santoro, Ryan Faulkner, et al. 2018. Relational inductive biases, deep learning, and graph networks. arXiv preprint arXiv:1806.01261 (2018).
  • [3] Karsten M Borgwardt, Cheng Soon Ong, Stefan Schönauer, SVN Vishwanathan, Alex J Smola, and Hans-Peter Kriegel. 2005. Protein function prediction via graph kernels. Bioinformatics 21, suppl_1 (2005), i47–i56.
  • [4] Joan Bruna, Wojciech Zaremba, Arthur Szlam, and Yann LeCun. 2013. Spectral networks and locally connected networks on graphs. arXiv preprint arXiv:1312.6203 (2013).
  • [5] Siheng Chen, Aliaksei Sandryhaila, Jose MF Moura, and Jelena Kovacevic. 2014. Signal denoising on graphs via graph filtering. In GlobalSIP.
  • [6] Fan RK Chung. 1997. Spectral graph theory. Number 92. American Mathematical Soc.
  • [7] Hanjun Dai, Bo Dai, and Le Song. 2016. Discriminative embeddings of latent variable models for structured data. In ICML. 2702–2711.
  • [8] Michaël Defferrard, Xavier Bresson, and Pierre Vandergheynst. 2016. Convolutional neural networks on graphs with fast localized spectral filtering. In NIPS.
  • [9] Tyler Derr, Yao Ma, and Jiliang Tang. 2018. Signed graph convolutional networks. In 2018 IEEE International Conference on Data Mining (ICDM). IEEE, 929–934.
  • [10] Paul D Dobson and Andrew J Doig. 2003. Distinguishing enzyme structures from non-enzymes without alignments. JMB 330, 4 (2003), 771–783.
  • [11] David K Duvenaud, Dougal Maclaurin, Jorge Iparraguirre, Rafael Bombarell, Timothy Hirzel, Alán Aspuru-Guzik, and Ryan P Adams. 2015. Convolutional networks on graphs for learning molecular fingerprints. In NIPS. 2224–2232.
  • [12] Wenqi Fan, Yao Ma, Qing Li, Yuan He, Eric Zhao, Jiliang Tang, and Dawei Yin. 2019. Graph neural networks for social recommendation. arXiv preprint arXiv:1902.07243 (2019).
  • [13] Matthias Fey, Jan Eric Lenssen, Frank Weichert, and Heinrich Müller. 2018. SplineCNN: Fast geometric deep learning with continuous B-spline kernels. In CVPR.
  • [14] Hongyang Gao and Shuiwang Ji. 2019. Graph representation learning via hard and channel-wise attention networks. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. ACM.
  • [15] Hongyang Gao and Shuiwang Ji. 2019. Graph U-Nets. In Proceedings of the 36th International Conference on Machine Learning.
  • [16] Hongyang Gao, Zhengyang Wang, and Shuiwang Ji. 2018. Large-scale learnable graph convolutional networks. In Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. ACM, 1416–1424.
  • [17] Justin Gilmer, Samuel S Schoenholz, Patrick F Riley, Oriol Vinyals, and George E Dahl. 2017. Neural message passing for quantum chemistry. In ICML. 1263–1272.
  • [18] Will Hamilton, Zhitao Ying, and Jure Leskovec. 2017. Inductive representation learning on large graphs. In NIPS. 1024–1034.
  • [19] Mikael Henaff, Joan Bruna, and Yann LeCun. 2015. Deep convolutional networks on graph-structured data. arXiv preprint arXiv:1506.05163 (2015).
  • [20] Jeroen Kazius, Ross McGuire, and Roberta Bursi. 2005. Derivation and validation of toxicophores for mutagenicity prediction. JMC 48, 1 (2005), 312–320.
  • [21] Kristian Kersting, Nils M. Kriege, Christopher Morris, Petra Mutzel, and Marion Neumann. 2016. Benchmark Data Sets for Graph Kernels. http://graphkernels.
  • [22] Thomas N Kipf and Max Welling. 2016. Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907 (2016).
  • [23] Alex Krizhevsky, Ilya Sutskever, and Geoffrey E Hinton. 2012. ImageNet classification with deep convolutional neural networks. In NIPS. 1097–1105.
  • [24] Ron Levie, Federico Monti, Xavier Bresson, and Michael M Bronstein. 2017. CayleyNets: Graph convolutional neural networks with complex rational spectral filters. arXiv preprint arXiv:1705.07664 (2017).
  • [25] Yujia Li, Daniel Tarlow, Marc Brockschmidt, and Richard Zemel. 2015. Gated graph sequence neural networks. arXiv preprint arXiv:1511.05493 (2015).
  • [26] Yao Ma, Ziyi Guo, Zhaochun Ren, Eric Zhao, Jiliang Tang, and Dawei Yin. 2018. Dynamic graph neural networks. arXiv preprint arXiv:1810.10627 (2018).
  • [27] Yao Ma, Suhang Wang, Charu C Aggarwal, Dawei Yin, and Jiliang Tang. 2019. Multi-dimensional graph convolutional networks. In Proceedings of the 2019 SIAM International Conference on Data Mining. SIAM, 657–665.
  • [28] Federico Monti, Michael Bronstein, and Xavier Bresson. 2017. Geometric matrix completion with recurrent multi-graph neural networks. In Advances in Neural Information Processing Systems. 3697–3707.
  • [29] Sunil K Narang and Antonio Ortega. 2012. Perfect reconstruction two-channel wavelet filter banks for graph structured data. IEEE Transactions on Signal Processing 60, 6 (2012), 2786–2799.
  • [30] Mathias Niepert, Mohamed Ahmed, and Konstantin Kutzkov. 2016. Learning convolutional neural networks for graphs. In ICML. 2014–2023.
  • [31] Kaspar Riesen and Horst Bunke. 2008. IAM graph database repository for graph based pattern recognition and machine learning. In Joint IAPR International Workshops on SPR and SSPR. Springer, 287–297.
  • [32] Alvaro Sanchez-Gonzalez, Nicolas Heess, Jost Tobias Springenberg, Josh Merel, Martin Riedmiller, Raia Hadsell, and Peter Battaglia. 2018. Graph networks as learnable physics engines for inference and control. arXiv preprint arXiv:1806.01242 (2018).
  • [33] Aliaksei Sandryhaila and José MF Moura. 2013. Discrete signal processing on graphs. IEEE Transactions on Signal Processing 61, 7 (2013), 1644–1656.
  • [34] Franco Scarselli, Marco Gori, Ah Chung Tsoi, Markus Hagenbuchner, and Gabriele Monfardini. 2009. The graph neural network model. IEEE Transactions on Neural Networks 20, 1 (2009), 61–80.
  • [35] Michael Schlichtkrull, Thomas N Kipf, Peter Bloem, Rianne van den Berg, Ivan Titov, and Max Welling. 2018. Modeling relational data with graph convolutional networks. In European Semantic Web Conference. Springer, 593–607.
  • [36] Ida Schomburg, Antje Chang, Christian Ebeling, Marion Gremse, Christian Heldt, Gregor Huhn, and Dietmar Schomburg. 2004. BRENDA, the enzyme database: updates and major new developments. Nucleic Acids Research 32, suppl_1 (2004), D431–D433.
  • [37] Nino Shervashidze, Pascal Schweitzer, Erik Jan van Leeuwen, Kurt Mehlhorn, and Karsten M Borgwardt. 2011. Weisfeiler-Lehman graph kernels. JMLR 12, Sep (2011), 2539–2561.
  • [38] David I Shuman, Sunil K Narang, Pascal Frossard, Antonio Ortega, and Pierre Vandergheynst. 2013. The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains. IEEE Signal Processing Magazine 30, 3 (2013), 83–98.
  • [39] Martin Simonovsky and Nikos Komodakis. 2017. Dynamic edge-conditioned filters in convolutional neural networks on graphs. In CVPR. 3693–3702.
  • [40] Nicolas Tremblay and Pierre Borgnat. 2016. Subgraph-based filterbanks for graph signals. IEEE Transactions on Signal Processing 64, 15 (2016), 3827–3840.
  • [41] Rakshit Trivedi, Hanjun Dai, Yichen Wang, and Le Song. 2017. Know-Evolve: Deep temporal reasoning for dynamic knowledge graphs. In Proceedings of the 34th International Conference on Machine Learning. JMLR.org, 3462–3471.
  • [42] Petar Veličković, Guillem Cucurull, Arantxa Casanova, Adriana Romero, Pietro Liò, and Yoshua Bengio. 2017. Graph attention networks. arXiv preprint arXiv:1710.10903 (2017).
  • [43] Oriol Vinyals, Samy Bengio, and Manjunath Kudlur. 2015. Order matters: Sequence to sequence for sets. arXiv preprint arXiv:1511.06391 (2015).
  • [44] Nikil Wale, Ian A Watson, and George Karypis. 2008. Comparison of descriptor spaces for chemical compound retrieval and classification. Knowledge and Information Systems 14, 3 (2008), 347–375.
  • [45] Xiaolong Wang, Yufei Ye, and Abhinav Gupta. 2018. Zero-shot recognition via semantic embeddings and knowledge graphs. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 6857–6866.
  • [46] Zonghan Wu, Shirui Pan, Fengwen Chen, Guodong Long, Chengqi Zhang, and Philip S Yu. 2019. A comprehensive survey on graph neural networks. arXiv preprint arXiv:1901.00596 (2019).
  • [47] Rex Ying, Ruining He, Kaifeng Chen, Pong Eksombatchai, William L Hamilton, and Jure Leskovec. 2018. Graph convolutional neural networks for web-scale recommender systems. In Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. ACM, 974–983.
  • [48] Zhitao Ying, Jiaxuan You, Christopher Morris, Xiang Ren, Will Hamilton, and Jure Leskovec. 2018. Hierarchical graph representation learning with differentiable pooling. In NIPS. 4805–4815.
  • [49] Muhan Zhang, Zhicheng Cui, Marion Neumann, and Yixin Chen. 2018. An end-to-end deep learning architecture for graph classification. In AAAI.
  • [50] Ziwei Zhang, Peng Cui, and Wenwu Zhu. 2018. Deep learning on graphs: A survey. arXiv preprint arXiv:1812.04202 (2018).
  • [51] Jie Zhou, Ganqu Cui, Zhengyan Zhang, Cheng Yang, Zhiyuan Liu, and Maosong Sun. 2018. Graph neural networks: A review of methods and applications. arXiv preprint arXiv:1812.08434 (2018).