Graph Structure of Neural Networks
International Conference on Machine Learning, 2020.
Abstract:
Many learning tasks require dealing with graph data, which contains rich relational information among elements. Modeling physics systems, learning molecular fingerprints, predicting protein interfaces, and classifying diseases all require a model that learns from graph inputs. In other domains, such as learning from non-structural data like texts a…
Introduction
- Graphs are a data structure that models a set of objects and their relationships.
- Because graphs are a unique non-Euclidean data structure for machine learning, graph analysis focuses on tasks such as node classification, link prediction, and clustering.
- Graph neural networks (GNNs) are deep learning based methods that operate on the graph domain.
- Due to their convincing performance and high interpretability, GNNs have recently become a widely applied method for graph analysis.
- The authors illustrate the fundamental motivations of graph neural networks
Highlights
- Graphs are a data structure that models a set of objects and their relationships
- There exist several comprehensive reviews of graph neural networks. [22] proposed a unified framework, MoNet, to generalize CNN architectures to non-Euclidean domains; this framework can generalize several spectral methods on graphs [2], [23] as well as some models on manifolds [24], [25]. [26] provides a thorough review of geometric deep learning, presenting its problems, difficulties, solutions, applications, and future directions. Whereas [22] and [26] focus on generalizing convolutions to graphs or manifolds, this paper focuses only on problems defined on graphs and also investigates other mechanisms used in graph neural networks, such as the gate mechanism, the attention mechanism, and skip connections. [27] proposed the message passing neural network (MPNN), which can generalize several graph neural network and graph convolutional network approaches (a minimal sketch of this abstraction follows this list). [28] proposed the non-local neural network (NLNN), which unifies several “self-attention”-style methods
- We introduce graph convolutional networks and graph attention networks in Section 2.2.2, as they contribute to the propagation step
- This paper presents an extensive survey of graph neural networks with the following contributions
- For graph neural network models, we introduce variants categorized by graph type, propagation type, and training type
- We suggest four open problems indicating the major challenges and future research directions of graph neural networks, including model depth, scalability, the ability to deal with dynamic graphs and non-structural scenarios
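To make the MPNN abstraction mentioned above concrete, here is a minimal sketch of one message-passing round. It is illustrative only: `message_fn` and `update_fn` stand in for the learned functions M_t and U_t of [27], and the dictionary-based graph representation is an assumption for readability, not the paper's implementation.

```python
import numpy as np

def mpnn_step(h, edges, message_fn, update_fn):
    """One message-passing round in the MPNN abstraction [27]:
    m_v = sum over incoming edges (w -> v) of message_fn(h_v, h_w, e_wv),
    then h_v <- update_fn(h_v, m_v) for every node v."""
    msgs = {v: np.zeros_like(x) for v, x in h.items()}
    for w, v, e_wv in edges:                      # directed edge w -> v
        msgs[v] = msgs[v] + message_fn(h[v], h[w], e_wv)
    return {v: update_fn(h[v], msgs[v]) for v in h}

# Example instantiation (illustrative, not any specific published model):
h0 = {0: np.ones(4), 1: np.zeros(4)}
edges = [(0, 1, None), (1, 0, None)]
h1 = mpnn_step(h0, edges,
               message_fn=lambda hv, hw, e: hw,          # pass neighbor state
               update_fn=lambda hv, m: np.tanh(hv + m))  # simple update
```

Choosing different message and update functions recovers many of the GNN and GCN variants surveyed here, which is exactly the unifying point of [27].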
Methods
- Spectral variants (ChebNet, its 1st-order approximation, the single-parameter model, GCN), Neural FPs, and Graph Attention Networks (GAT) differ mainly in their aggregators; ChebNet, for example, aggregates with N_k = T_k(L̃)X, N_0 = X, where T_k is the k-th Chebyshev polynomial of the rescaled graph Laplacian L̃ (see the recurrence sketch after this list).
- The original graph convolutional neural network has several drawbacks in its training and optimization methods (a dense sketch of its propagation rule follows this list).
- To solve the problems mentioned above, GraphSAGE replaced the full graph Laplacian with learnable aggregation functions, which are key to performing message passing and generalizing to unseen nodes.
- With learned aggregation and propagation functions, GraphSAGE can generate embeddings for unseen nodes.
- GraphSAGE uses neighbor sampling to alleviate receptive field expansion (see the aggregation sketch below)
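The ChebNet aggregator quoted in the first Methods bullet can be evaluated without eigendecomposing the Laplacian by using the Chebyshev recurrence. A minimal NumPy sketch, assuming `L_tilde` is the rescaled Laplacian 2L/λ_max − I; the function name is illustrative:

```python
import numpy as np

def chebyshev_basis(X, L_tilde, K):
    """Compute N_k = T_k(L_tilde) @ X for k = 0..K-1 via the Chebyshev
    recurrence T_k(x) = 2 x T_{k-1}(x) - T_{k-2}(x), starting from
    N_0 = X and N_1 = L_tilde @ X."""
    N = [X]
    if K > 1:
        N.append(L_tilde @ X)
    for _ in range(2, K):
        N.append(2 * (L_tilde @ N[-1]) - N[-2])
    return N  # each N[k] is then combined with a learned weight theta_k
```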
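For contrast with GraphSAGE, a dense sketch of the original GCN propagation rule H' = σ(D̂^(-1/2)(A + I)D̂^(-1/2) H W) of Kipf and Welling. Note that it multiplies by a normalized adjacency built from the entire graph, which is the full-Laplacian dependence that GraphSAGE removes; the function name and the ReLU nonlinearity are illustrative choices:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN propagation step on the FULL graph.
    A: (n, n) adjacency matrix; H: (n, d_in) node features;
    W: (d_in, d_out) learned weights."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)                     # degrees of A_hat (always >= 1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization
    return np.maximum(L_norm @ H @ W, 0.0)    # ReLU
```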
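And a minimal sketch of a GraphSAGE-style mean aggregator with neighbor sampling, as referenced in the bullets above. It assumes dictionary-based features and adjacency; `W` and `num_samples` are illustrative hyperparameters, and the published model additionally L2-normalizes each output embedding:

```python
import numpy as np
import random

def sage_mean_layer(features, adj, W, num_samples=10):
    """One GraphSAGE-style layer: sample up to `num_samples` neighbors per
    node, mean-aggregate their features, concatenate with the node's own
    features, and apply a learned transform W of shape (d_out, 2*d_in)."""
    out = {}
    for v, x in features.items():
        nbrs = adj.get(v, [])
        if len(nbrs) > num_samples:
            nbrs = random.sample(nbrs, num_samples)   # fixed-size neighborhood
        agg = (np.mean([features[u] for u in nbrs], axis=0)
               if nbrs else np.zeros_like(x))
        h = np.concatenate([x, agg])                  # CONCAT(self, neighborhood)
        out[v] = np.maximum(W @ h, 0.0)               # ReLU
    return out
```

Sampling caps the per-node neighborhood size, so stacking layers expands the receptive field only linearly in `num_samples` rather than with the full k-hop neighborhood, and unseen nodes can be embedded by running the same aggregation over their sampled neighbors.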
Conclusion
- Over the past few years, graph neural networks have become powerful and practical tools for machine learning tasks in the graph domain.
- This progress owes much to advances in expressive power, model flexibility, and training algorithms.
- In this survey, the authors conduct a comprehensive review of graph neural networks.
- The authors suggest four open problems indicating the major challenges and future research directions of graph neural networks, including model depth, scalability, the ability to deal with dynamic graphs and non-structural scenarios
Tables
- Table 1: Notations used in this paper
- Table 2: Different variants of graph neural networks
- Table 3: Applications of graph neural networks
References
- W. L. Hamilton, Z. Ying, and J. Leskovec, “Inductive representation learning on large graphs,” NIPS 2017, pp. 1024–1034, 2017.
- T. N. Kipf and M. Welling, “Semi-supervised classification with graph convolutional networks,” ICLR 2017, 2017.
- A. Sanchez-Gonzalez, N. Heess, J. T. Springenberg, J. Merel, M. Riedmiller, R. Hadsell, and P. Battaglia, “Graph networks as learnable physics engines for inference and control,” arXiv preprint arXiv:1806.01242, 2018.
- P. Battaglia, R. Pascanu, M. Lai, D. J. Rezende et al., “Interaction networks for learning about objects, relations and physics,” in NIPS 2016, 2016, pp. 4502–4510.
- A. Fout, J. Byrd, B. Shariat, and A. Ben-Hur, “Protein interface prediction using graph convolutional networks,” in NIPS 2017, 2017, pp. 6530–6539.
- T. Hamaguchi, H. Oiwa, M. Shimbo, and Y. Matsumoto, “Knowledge transfer for out-of-knowledge-base entities: A graph neural network approach,” in IJCAI 2017, 2017, pp. 1802–1808.
- H. Dai, E. B. Khalil, Y. Zhang, B. Dilkina, and L. Song, “Learning combinatorial optimization algorithms over graphs,” arXiv preprint arXiv:1704.01665, 2017.
- Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner, “Gradient-based learning applied to document recognition,” Proceedings of the IEEE, vol. 86, no. 11, pp. 2278–2324, 1998.
- Y. LeCun, Y. Bengio, and G. Hinton, “Deep learning,” Nature 2015, vol. 521, no. 7553, p. 436, 2015.
- F. R. Chung and F. C. Graham, Spectral graph theory. American Mathematical Soc., 1997, no. 92.
- P. Cui, X. Wang, J. Pei, and W. Zhu, “A survey on network embedding,” IEEE Transactions on Knowledge and Data Engineering, 2018.
- W. L. Hamilton, R. Ying, and J. Leskovec, “Representation learning on graphs: Methods and applications,” IEEE Data(base) Engineering Bulletin, vol. 40, pp. 52–74, 2017.
- D. Zhang, J. Yin, X. Zhu, and C. Zhang, “Network representation learning: A survey,” IEEE transactions on Big Data, 2018.
- H. Cai, V. W. Zheng, and K. C.-C. Chang, “A comprehensive survey of graph embedding: Problems, techniques, and applications,” IEEE Transactions on Knowledge and Data Engineering, vol. 30, no. 9, pp. 1616–1637, 2018.
- P. Goyal and E. Ferrara, “Graph embedding techniques, applications, and performance: A survey,” Knowledge-Based Systems, vol. 151, pp. 78–94, 2018.
- T. Mikolov, K. Chen, G. Corrado, and J. Dean, “Efficient estimation of word representations in vector space,” arXiv preprint arXiv:1301.3781, 2013.
- B. Perozzi, R. Al-Rfou, and S. Skiena, “Deepwalk: Online learning of social representations,” in SIGKDD 2014. ACM, 2014, pp. 701–710.
- A. Grover and J. Leskovec, “node2vec: Scalable feature learning for networks,” in SIGKDD. ACM, 2016, pp. 855–864.
- J. Tang, M. Qu, M. Wang, M. Zhang, J. Yan, and Q. Mei, “Line: Large-scale information network embedding,” in WWW 2015, 2015, pp. 1067–1077.
- C. Yang, Z. Liu, D. Zhao, M. Sun, and E. Y. Chang, “Network representation learning with rich text information.” in IJCAI 2015, 2015, pp. 2111–2117.
- T. Kawamoto, M. Tsubaki, and T. Obuchi, “Mean-field theory of graph neural networks in graph partitioning,” in NeurIPS 2018, 2018, pp. 4366–4376.
- F. Monti, D. Boscaini, J. Masci, E. Rodola, J. Svoboda, and M. M. Bronstein, “Geometric deep learning on graphs and manifolds using mixture model cnns,” CVPR 2017, pp. 5425–5434, 2017.
- J. Atwood and D. Towsley, “Diffusion-convolutional neural networks,” in NIPS 2016, 2016, pp. 1993–2001.
- J. Masci, D. Boscaini, M. Bronstein, and P. Vandergheynst, “Geodesic convolutional neural networks on riemannian manifolds,” in ICCV workshops 2015, 2015, pp. 37–45.
- D. Boscaini, J. Masci, E. Rodola, and M. Bronstein, “Learning shape correspondence with anisotropic convolutional neural networks,” in NIPS 2016, 2016, pp. 3189–3197.
- M. M. Bronstein, J. Bruna, Y. LeCun, A. Szlam, and P. Vandergheynst, “Geometric deep learning: going beyond euclidean data,” IEEE SPM 2017, vol. 34, no. 4, pp. 18–42, 2017.
- J. Gilmer, S. S. Schoenholz, P. F. Riley, O. Vinyals, and G. E. Dahl, “Neural message passing for quantum chemistry,” arXiv preprint arXiv:1704.01212, 2017.
- X. Wang, R. Girshick, A. Gupta, and K. He, “Non-local neural networks,” arXiv preprint arXiv:1711.07971, 2017.
- J. B. Lee, R. A. Rossi, S. Kim, N. K. Ahmed, and E. Koh, “Attention models in graphs: A survey,” arXiv preprint arXiv:1807.07984, 2018.
- P. W. Battaglia, J. B. Hamrick, V. Bapst, A. Sanchez-Gonzalez, V. Zambaldi, M. Malinowski, A. Tacchetti, D. Raposo, A. Santoro, R. Faulkner et al., “Relational inductive biases, deep learning, and graph networks,” arXiv preprint arXiv:1806.01261, 2018.
- Z. Zhang, P. Cui, and W. Zhu, “Deep learning on graphs: A survey,” arXiv preprint arXiv:1812.04202, 2018.
- Z. Wu, S. Pan, F. Chen, G. Long, C. Zhang, and P. S. Yu, “A comprehensive survey on graph neural networks,” arXiv preprint arXiv:1901.00596, 2019.
- F. Scarselli, M. Gori, A. C. Tsoi, M. Hagenbuchner, and G. Monfardini, “The graph neural network model,” IEEE TNN 2009, vol. 20, no. 1, pp. 61–80, 2009.
- M. A. Khamsi and W. A. Kirk, An introduction to metric spaces and fixed point theory. John Wiley & Sons, 2011, vol. 53.
- M. Kampffmeyer, Y. Chen, X. Liang, H. Wang, Y. Zhang, and E. P. Xing, “Rethinking knowledge graph propagation for zero-shot learning,” arXiv preprint arXiv:1805.11724, 2018.
- Y. Zhang, Y. Xiong, X. Kong, S. Li, J. Mi, and Y. Zhu, “Deep collective classification in heterogeneous information networks,” in WWW 2018, 2018, pp. 399–408.
- X. Wang, H. Ji, C. Shi, B. Wang, Y. Ye, P. Cui, and P. S. Yu, “Heterogeneous graph attention network,” WWW 2019, 2019.
- D. Beck, G. Haffari, and T. Cohn, “Graph-to-sequence learning using gated graph neural networks,” in ACL 2018, 2018, pp. 273–283.
- M. Schlichtkrull, T. N. Kipf, P. Bloem, R. van den Berg, I. Titov, and M. Welling, “Modeling relational data with graph convolutional networks,” in ESWC 2018. Springer, 2018, pp. 593–607.
- Y. Li, R. Yu, C. Shahabi, and Y. Liu, “Diffusion convolutional recurrent neural network: Data-driven traffic forecasting,” arXiv preprint arXiv:1707.01926, 2017.
- B. Yu, H. Yin, and Z. Zhu, “Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting,” arXiv preprint arXiv:1709.04875, 2017.
- A. Jain, A. R. Zamir, S. Savarese, and A. Saxena, “Structural-rnn: Deep learning on spatio-temporal graphs,” in CVPR 2016, 2016, pp. 5308–5317.
- S. Yan, Y. Xiong, and D. Lin, “Spatial temporal graph convolutional networks for skeleton-based action recognition,” in Thirty-Second AAAI Conference on Artificial Intelligence, 2018.
- N. Peng, H. Poon, C. Quirk, K. Toutanova, and W.-t. Yih, “Cross-sentence n-ary relation extraction with graph lstms,” arXiv preprint arXiv:1708.03743, 2017.
- J. Bruna, W. Zaremba, A. Szlam, and Y. Lecun, “Spectral networks and locally connected networks on graphs,” ICLR 2014, 2014.
- M. Henaff, J. Bruna, and Y. Lecun, “Deep convolutional networks on graph-structured data,” arXiv preprint arXiv:1506.05163, 2015.
- D. K. Hammond, P. Vandergheynst, and R. Gribonval, “Wavelets on graphs via spectral graph theory,” Applied and Computational Harmonic Analysis, vol. 30, no. 2, pp. 129–150, 2011.
- M. Defferrard, X. Bresson, and P. Vandergheynst, “Convolutional neural networks on graphs with fast localized spectral filtering,” NIPS 2016, pp. 3844–3852, 2016.
- R. Li, S. Wang, F. Zhu, and J. Huang, “Adaptive graph convolutional neural networks,” in AAAI 2018, 2018.
- Y. C. Ng, N. Colombo, and R. Silva, “Bayesian semi-supervised learning with graph gaussian processes,” in NeurIPS 2018, 2018, pp. 1690–1701.
- D. K. Duvenaud, D. Maclaurin, J. Aguilera-Iparraguirre, R. Gomez-Bombarelli, T. D. Hirzel, A. Aspuru-Guzik, and R. P. Adams, “Convolutional networks on graphs for learning molecular fingerprints,” NIPS 2015, pp. 2224–2232, 2015.
- C. Zhuang and Q. Ma, “Dual graph convolutional networks for graph-based semi-supervised classification,” in WWW 2018, 2018.
- M. Niepert, M. Ahmed, and K. Kutzkov, “Learning convolutional neural networks for graphs,” in ICML 2016, 2016, pp. 2014–2023.
- H. Gao, Z. Wang, and S. Ji, “Large-scale learnable graph convolutional networks,” in Proceedings of SIGKDD. ACM, 2018, pp. 1416–1424.
- K. He, X. Zhang, S. Ren, and J. Sun, “Identity mappings in deep residual networks,” in ECCV 2016. Springer, 2016, pp. 630–645.
- J. Chang, J. Gu, L. Wang, G. Meng, S. Xiang, and C. Pan, “Structure-aware convolutional neural networks,” in NeurIPS 2018, 2018, pp. 11–20.
- K. Cho, B. Van Merrienboer, C. Gulcehre, D. Bahdanau, F. Bougares, H. Schwenk, and Y. Bengio, “Learning phrase representations using rnn encoder–decoder for statistical machine translation,” EMNLP 2014, pp. 1724–1734, 2014.
- S. Hochreiter and J. Schmidhuber, “Long short-term memory,” Neural Computation, vol. 9, no. 8, pp. 1735–1780, 1997.
- Y. Li, D. Tarlow, M. Brockschmidt, and R. S. Zemel, “Gated graph sequence neural networks,” arXiv preprint arXiv:1511.05493, 2016.
- K. S. Tai, R. Socher, and C. D. Manning, “Improved semantic representations from tree-structured long short-term memory networks,” IJCNLP 2015, pp. 1556–1566, 2015.
- V. Zayats and M. Ostendorf, “Conversation modeling on reddit using a graph-structured lstm,” TACL 2018, vol. 6, pp. 121–132, 2018.
- Y. Zhang, Q. Liu, and L. Song, “Sentence-state lstm for text representation,” ACL 2018, vol. 1, pp. 317–327, 2018.
- X. Liang, X. Shen, J. Feng, L. Lin, and S. Yan, “Semantic object parsing with graph lstm,” ECCV 2016, pp. 125–143, 2016.
- D. Bahdanau, K. Cho, and Y. Bengio, “Neural machine translation by jointly learning to align and translate,” ICLR 2015, 2015.
- J. Gehring, M. Auli, D. Grangier, and Y. N. Dauphin, “A convolutional encoder model for neural machine translation,” ACL 2017, vol. 1, pp. 123–135, 2017.
- A. Vaswani, N. Shazeer, N. Parmar, L. Jones, J. Uszkoreit, A. N. Gomez, and L. Kaiser, “Attention is all you need,” NIPS 2017, pp. 5998–6008, 2017.
- J. Cheng, L. Dong, and M. Lapata, “Long short-term memory-networks for machine reading,” EMNLP 2016, pp. 551–561, 2016.
- P. Velickovic, G. Cucurull, A. Casanova, A. Romero, P. Lio, and Y. Bengio, “Graph attention networks,” ICLR 2018, 2018.
- J. Zhang, X. Shi, J. Xie, H. Ma, I. King, and D.-Y. Yeung, “Gaan: Gated attention networks for learning on large and spatiotemporal graphs,” arXiv preprint arXiv:1803.07294, 2018.
- K. He, X. Zhang, S. Ren, and J. Sun, “Deep residual learning for image recognition,” CVPR 2016, pp. 770–778, 2016.
- A. Rahimi, T. Cohn, and T. Baldwin, “Semi-supervised user geolocation via graph convolutional networks,” ACL 2018, vol. 1, pp. 2009–2019, 2018.
- J. G. Zilly, R. K. Srivastava, J. Koutnik, and J. Schmidhuber, “Recurrent highway networks,” ICML 2017, pp. 4189–4198, 2017.
- T. Pham, T. Tran, D. Phung, and S. Venkatesh, “Column networks for collective classification,” in AAAI 2017, 2017.
- K. Xu, C. Li, Y. Tian, T. Sonobe, K. Kawarabayashi, and S. Jegelka, “Representation learning on graphs with jumping knowledge networks,” ICML 2018, pp. 5449–5458, 2018.
- M. Simonovsky and N. Komodakis, “Dynamic edge-conditioned filters in convolutional neural networks on graphs,” in Proceedings of the IEEE conference on computer vision and pattern recognition, 2017, pp. 3693–3702.
- Z. Ying, J. You, C. Morris, X. Ren, W. Hamilton, and J. Leskovec, “Hierarchical graph representation learning with differentiable pooling,” in NeurIPS 2018, 2018, pp. 4805–4815.
- R. Ying, R. He, K. Chen, P. Eksombatchai, W. L. Hamilton, and J. Leskovec, “Graph convolutional neural networks for web-scale recommender systems,” in SIGKDD 2018, 2018.
- J. Chen, T. Ma, and C. Xiao, “Fastgcn: fast learning with graph convolutional networks via importance sampling,” arXiv preprint arXiv:1801.10247, 2018.
- W. Huang, T. Zhang, Y. Rong, and J. Huang, “Adaptive sampling towards fast graph representation learning,” in Proceedings of NeurIPS, 2018, pp. 4563–4572.
- H. Dai, Z. Kozareva, B. Dai, A. Smola, and L. Song, “Learning steady-states of iterative algorithms over graphs,” in International Conference on Machine Learning, 2018, pp. 1114–1122.
- J. Chen, J. Zhu, and L. Song, “Stochastic training of graph convolutional networks with variance reduction.” in ICML 2018, 2018, pp. 941–949.
- Q. Li, Z. Han, and X.-M. Wu, “Deeper insights into graph convolutional networks for semi-supervised learning,” arXiv preprint arXiv:1801.07606, 2018.
- T. N. Kipf and M. Welling, “Variational graph auto-encoders,” arXiv preprint arXiv:1611.07308, 2016.
- R. van den Berg, T. N. Kipf, and M. Welling, “Graph convolutional matrix completion,” arXiv preprint arXiv:1706.02263, 2017.
- S. Pan, R. Hu, G. Long, J. Jiang, L. Yao, and C. Zhang, “Adversarially regularized graph autoencoder for graph embedding,” arXiv preprint arXiv:1802.04407, 2018.
- W. Yu, C. Zheng, W. Cheng, C. C. Aggarwal, D. Song, B. Zong, H. Chen, and W. Wang, “Learning deep network representations with adversarially regularized autoencoders,” in SIGKDD 2018, 2018.
- S. Cao, W. Lu, and Q. Xu, “Deep neural networks for learning graph representations,” in AAAI 2016, 2016.
- D. Wang, P. Cui, and W. Zhu, “Structural deep network embedding,” in SIGKDD 2016, 2016.
- K. Tu, P. Cui, X. Wang, P. S. Yu, and W. Zhu, “Deep recursive network embedding with regular equivalence,” in SIGKDD 2018, 2018.
- Y. Hoshen, “Vain: Attentional multi-agent predictive modeling,” in NIPS 2017, 2017, pp. 2701–2711.
- N. Watters, D. Zoran, T. Weber, P. Battaglia, R. Pascanu, and A. Tacchetti, “Visual interaction networks: Learning a physics simulator from video,” in NIPS 2017, 2017, pp. 4539–4547.
- M. B. Chang, T. Ullman, A. Torralba, and J. B. Tenenbaum, “A compositional object-based approach to learning physical dynamics,” arXiv preprint arXiv:1612.00341, 2016.
- S. Sukhbaatar, R. Fergus et al., “Learning multiagent communication with backpropagation,” in NIPS 2016, 2016, pp. 2244–2252.
- H. Dai, B. Dai, and L. Song, “Discriminative embeddings of latent variable models for structured data,” in ICML 2016, 2016, pp. 2702–2711.
- D. Raposo, A. Santoro, D. Barrett, R. Pascanu, T. Lillicrap, and P. Battaglia, “Discovering objects and their relations from entangled scene representations,” arXiv preprint arXiv:1702.05068, 2017.
- A. Santoro, D. Raposo, D. G. Barrett, M. Malinowski, R. Pascanu, P. Battaglia, and T. Lillicrap, “A simple neural network module for relational reasoning,” in NIPS 2017, 2017, pp. 4967–4976.
- M. Zaheer, S. Kottur, S. Ravanbakhsh, B. Poczos, R. R. Salakhutdinov, and A. J. Smola, “Deep sets,” in NIPS 2017, 2017, pp. 3391–3401.
- C. R. Qi, H. Su, K. Mo, and L. J. Guibas, “Pointnet: Deep learning on point sets for 3d classification and segmentation,” CVPR 2017, vol. 1, no. 2, p. 4, 2017.
- S. Kearnes, K. McCloskey, M. Berndl, V. Pande, and P. Riley, “Molecular graph convolutions: moving beyond fingerprints,” Journal of computer-aided molecular design, vol. 30, no. 8, pp. 595–608, 2016.
- K. T. Schutt, F. Arbabzadah, S. Chmiela, K. R. Muller, and A. Tkatchenko, “Quantum-chemical insights from deep tensor neural networks,” Nature communications, vol. 8, p. 13890, 2017.
- A. Buades, B. Coll, and J.-M. Morel, “A non-local algorithm for image denoising,” in CVPR 2005, vol. 2. IEEE, 2005, pp. 60–65.
- C. Tomasi and R. Manduchi, “Bilateral filtering for gray and color images,” in Computer Vision 1998. IEEE, 1998, pp. 839–846.
- T. Kipf, E. Fetaya, K.-C. Wang, M. Welling, and R. Zemel, “Neural relational inference for interacting systems,” arXiv preprint arXiv:1802.04687, 2018.
- J. B. Hamrick, K. R. Allen, V. Bapst, T. Zhu, K. R. McKee, J. B. Tenenbaum, and P. W. Battaglia, “Relational inductive bias for physical construction in humans and machines,” arXiv preprint arXiv:1806.01203, 2018.
- T. Wang, R. Liao, J. Ba, and S. Fidler, “Nervenet: Learning structured policy with graph neural networks,” 2018.
- H. Peng, J. Li, Y. He, Y. Liu, M. Bao, L. Wang, Y. Song, and Q. Yang, “Large-scale hierarchical text classification with recursively regularized deep graph-cnn,” in WWW 2018, 2018, pp. 1063–1072.
- L. Yao, C. Mao, and Y. Luo, “Graph convolutional networks for text classification,” arXiv preprint arXiv:1809.05679, 2018.
- D. Marcheggiani and I. Titov, “Encoding sentences with graph convolutional networks for semantic role labeling,” in Proceedings of EMNLP, 2017, pp. 1506–1515.
- J. Bastings, I. Titov, W. Aziz, D. Marcheggiani, and K. Simaan, “Graph convolutional encoders for syntax-aware neural machine translation,” EMNLP 2017, pp. 1957–1967, 2017.
- D. Marcheggiani, J. Bastings, and I. Titov, “Exploiting semantics in neural machine translation with graph convolutional networks,” arXiv preprint arXiv:1804.08313, 2018.
- M. Miwa and M. Bansal, “End-to-end relation extraction using lstms on sequences and tree structures,” arXiv preprint arXiv:1601.00770, 2016.
- L. Song, Y. Zhang, Z. Wang, and D. Gildea, “N-ary relation extraction using graph state lstm,” arXiv preprint arXiv:1808.09101, 2018.
- Y. Zhang, P. Qi, and C. D. Manning, “Graph convolution over pruned dependency trees improves relation extraction,” arXiv preprint arXiv:1809.10185, 2018.
- T. H. Nguyen and R. Grishman, “Graph convolutional networks with argument-aware pooling for event detection,” 2018.
- X. Liu, Z. Luo, and H. Huang, “Jointly multiple events extraction via attention-based graph information aggregation,” arXiv preprint arXiv:1809.09078, 2018.
- L. Song, Y. Zhang, Z. Wang, and D. Gildea, “A graph-to-sequence model for amr-to-text generation,” arXiv preprint arXiv:1805.02473, 2018.
- L. Song, Z. Wang, M. Yu, Y. Zhang, R. Florian, and D. Gildea, “Exploring graph-structured passage representation for multi-hop reading comprehension with graph neural networks,” arXiv preprint arXiv:1809.02040, 2018.
- R. B. Palm, U. Paquet, and O. Winther, “Recurrent relational networks,” NeurIPS 2018, 2018.
- Z. Wang, T. Chen, J. Ren, W. Yu, H. Cheng, and L. Lin, “Deep reasoning with knowledge graph for social relationship understanding,” arXiv preprint arXiv:1807.00504, 2018.
- V. Garcia and J. Bruna, “Few-shot learning with graph neural networks,” arXiv preprint arXiv:1711.04043, 2017.
- X. Wang, Y. Ye, and A. Gupta, “Zero-shot recognition via semantic embeddings and knowledge graphs,” in CVPR 2018, 2018, pp. 6857–6866.
- C. Lee, W. Fang, C. Yeh, and Y. F. Wang, “Multi-label zero-shot learning with structured knowledge graphs,” in Proceedings of CVPR, 2018, pp. 1576–1585.
- K. Marino, R. Salakhutdinov, and A. Gupta, “The more you know: Using knowledge graphs for image classification,” in Proceedings of CVPR, 2017, pp. 20–28.
- D. Teney, L. Liu, and A. V. Den Hengel, “Graph-structured representations for visual question answering,” in Proceedings of CVPR, 2017, pp. 3233–3241.
- M. Narasimhan, S. Lazebnik, and A. G. Schwing, “Out of the box: Reasoning with graph convolution nets for factual visual question answering,” in Proceedings of NeurIPS, 2018, pp. 2654–2665.
- H. Hu, J. Gu, Z. Zhang, J. Dai, and Y. Wei, “Relation networks for object detection,” in CVPR 2018, vol. 2, no. 3, 2018.
- J. Gu, H. Hu, L. Wang, Y. Wei, and J. Dai, “Learning region features for object detection,” arXiv preprint arXiv:1803.07066, 2018.
- S. Qi, W. Wang, B. Jia, J. Shen, and S.-C. Zhu, “Learning human-object interactions by graph parsing neural networks,” arXiv preprint arXiv:1808.07962, 2018.
- X. Chen, L.-J. Li, L. Fei-Fei, and A. Gupta, “Iterative visual reasoning beyond convolutions,” arXiv preprint arXiv:1803.11189, 2018.
- X. Liang, L. Lin, X. Shen, J. Feng, S. Yan, and E. P. Xing, “Interpretable structure-evolving lstm,” in CVPR 2017, 2017, pp. 2175–2184.
- L. Landrieu and M. Simonovsky, “Large-scale point cloud semantic segmentation with superpoint graphs,” arXiv preprint arXiv:1711.09869, 2017.
- Y. Wang, Y. Sun, Z. Liu, S. E. Sarma, M. M. Bronstein, and J. M. Solomon, “Dynamic graph cnn for learning on point clouds,” arXiv preprint arXiv:1801.07829, 2018.
- X. Qi, R. Liao, J. Jia, S. Fidler, and R. Urtasun, “3d graph neural networks for rgbd semantic segmentation,” in CVPR 2017, 2017, pp. 5199–5208.
- M. Zitnik, M. Agrawal, and J. Leskovec, “Modeling polypharmacy side effects with graph convolutional networks,” arXiv preprint arXiv:1802.00543, 2018.
- S. Rhee, S. Seo, and S. Kim, “Hybrid approach of relation network and localized graph convolutional filtering for breast cancer subtype classification,” arXiv preprint arXiv:1711.05859, 2017.
- Z. Wang, Q. Lv, X. Lan, and Y. Zhang, “Cross-lingual knowledge graph alignment via graph convolutional networks,” in EMNLP 2018, 2018, pp. 349–357.
- A. Nowak, S. Villar, A. S. Bandeira, and J. Bruna, “Revised note on learning quadratic assignment with graph neural networks,” in IEEE DSW 2018. IEEE, 2018, pp. 1–5.
- Z. Li, Q. Chen, and V. Koltun, “Combinatorial optimization with graph convolutional networks and guided tree search,” in NeurIPS 2018, 2018, pp. 537–546.
- W. Kool and M. Welling, “Attention solves your tsp,” arXiv preprint arXiv:1803.08475, 2018.
- O. Shchur, D. Zugner, A. Bojchevski, and S. Gunnemann, “Netgan: Generating graphs via random walks,” in Proceedings of ICML, 2018, pp. 609–618.
- T. Ma, J. Chen, and C. Xiao, “Constrained generation of semantically valid graphs via regularizing variational autoencoders,” in NeurIPS 2018, 2018, pp. 7113–7124.
- J. You, B. Liu, R. Ying, V. Pande, and J. Leskovec, “Graph convolutional policy network for goal-directed molecular graph generation,” arXiv preprint arXiv:1806.02473, 2018.
- N. De Cao and T. Kipf, “Molgan: An implicit generative model for small molecular graphs,” arXiv preprint arXiv:1805.11973, 2018.
- Z. Cui, K. Henrickson, R. Ke, and Y. Wang, “Traffic graph convolutional recurrent neural network: A deep learning framework for network-scale traffic learning and forecasting,” 2018.
- M. Allamanis, M. Brockschmidt, and M. Khademi, “Learning to represent programs with graphs,” arXiv preprint arXiv:1711.00740, 2017.
- O. Russakovsky, J. Deng, H. Su, J. Krause, S. Satheesh, S. Ma, Z. Huang, A. Karpathy, A. Khosla, M. Bernstein et al., “Imagenet large scale visual recognition challenge,” IJCV 2015, vol. 115, no. 3, pp. 211–252, 2015.
- W. Norcliffe-Brown, S. Vafeias, and S. Parisot, “Learning conditioned graph structures for interpretable visual question answering,” in Proceedings of NeurIPS, 2018, pp. 8334–8343.
- J. You, R. Ying, X. Ren, W. Hamilton, and J. Leskovec, “Graphrnn: Generating realistic graphs with deep auto-regressive models,” in ICML 2018, 2018, pp. 5694–5703.
- Y. Li, O. Vinyals, C. Dyer, R. Pascanu, and P. Battaglia, “Learning deep generative models of graphs,” arXiv preprint arXiv:1803.03324, 2018.
- I. Bello, H. Pham, Q. V. Le, M. Norouzi, and S. Bengio, “Neural combinatorial optimization with reinforcement learning,” 2017.
- O. Vinyals, M. Fortunato, and N. Jaitly, “Pointer networks,” in NIPS 2015, 2015, pp. 2692–2700.
- R. S. Sutton and A. G. Barto, Reinforcement learning: An introduction. MIT press, 2018.
- E. Khalil, H. Dai, Y. Zhang, B. Dilkina, and L. Song, “Learning combinatorial optimization algorithms over graphs,” in NIPS 2017, 2017, pp. 6348–6358.