
A gentle introduction to deep learning for graphs. (English) Zbl 1475.68310

Summary: The adaptive processing of graph data is a long-standing research topic that has lately been consolidated as a theme of major interest in the deep learning community. The rapid increase in the amount and breadth of related research has come at the price of little systematization of knowledge and attention to earlier literature. This work is a tutorial introduction to the field of deep learning for graphs. It favors a consistent and progressive presentation of the main concepts and architectural aspects over an exposition of the most recent literature, for which the reader is referred to available surveys. The paper takes a top-down view of the problem, introducing a generalized formulation of graph representation learning based on a local and iterative approach to structured information processing. Moreover, it introduces the basic building blocks that can be combined to design novel and effective neural models for graphs. We complement the methodological exposition with a discussion of interesting research challenges and applications in the field.
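As a rough illustration of the local and iterative processing scheme the summary alludes to, the following minimal NumPy sketch (not taken from the reviewed paper; names such as aggregate_layer are illustrative assumptions) updates each node representation from its own state and the mean of its neighbors' states, stacks two such rounds, and averages the node states into a graph-level embedding.

# Minimal sketch, assuming a dense adjacency matrix and randomly drawn
# weights, of local and iterative neighborhood aggregation; this is an
# illustration of the general idea, not the paper's own formulation.
import numpy as np

def aggregate_layer(H, A, W_self, W_neigh):
    """One round of local aggregation.

    H       : (n_nodes, d_in) node representations at the current round
    A       : (n_nodes, n_nodes) binary adjacency matrix (no self-loops)
    W_self  : (d_in, d_out) weights applied to each node's own state
    W_neigh : (d_in, d_out) weights applied to the mean of neighbor states
    """
    deg = A.sum(axis=1, keepdims=True)          # node degrees
    neigh_mean = (A @ H) / np.maximum(deg, 1)   # mean over each neighborhood
    return np.tanh(H @ W_self + neigh_mean @ W_neigh)

# Tiny worked example: a 4-node cycle graph with 3-dimensional node features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
H = rng.normal(size=(4, 3))

# Stacking rounds makes the computation iterative: after k rounds each node
# representation depends on its k-hop neighborhood.
for _ in range(2):
    W_self = rng.normal(size=(H.shape[1], 3))
    W_neigh = rng.normal(size=(H.shape[1], 3))
    H = aggregate_layer(H, A, W_self, W_neigh)

graph_embedding = H.mean(axis=0)  # simple readout: average the node states
print(graph_embedding)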

MSC:

68T07 Artificial neural networks and deep learning
68R10 Graph theory (including graph drawing) in computer science
