Graph Structure and Isomorphism Learning: A Survey
Abstract
With the recent success of artificial intelligence, graph learning has been gaining attention from both academia and industry [1, 2]. The power of graph data lies in its capacity to represent complicated structures across a broad spectrum of application domains, including protein networks, social networks, food webs, molecular structures, knowledge graphs, sentence dependency trees, and scene graphs of images. However, designing an effective learning architecture for arbitrary graphs remains an open research problem because of two challenges: learning the complex topological structure of graphs, and handling their invariance under isomorphism. In this work, we summarize and discuss the latest graph learning methods, with special attention to these two aspects: structure learning and permutation-invariance learning. The survey begins by reviewing basic concepts from graph theory and graph signal processing. Next, we provide a systematic categorization of graph learning methods that address each of the two aspects. Finally, we conclude with discussions and open issues in research and practice.
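To make the isomorphism challenge concrete, the following is an illustrative sketch (not taken from the survey; graph representation and function names are our own assumptions) of 1-dimensional Weisfeiler-Lehman color refinement [21, 22], the classical heuristic that also bounds the expressive power of message-passing GNNs [69]:

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """1-WL refinement: repeatedly hash each node's color together with
    the multiset of its neighbors' colors; return the final histogram.
    `adj` is an adjacency-list dict {node: [neighbors]} (an assumed format)."""
    colors = {v: 0 for v in adj}  # start from a uniform coloring
    for _ in range(rounds):
        colors = {
            v: hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
            for v in adj
        }
    return Counter(colors.values())  # graph-level color histogram

def maybe_isomorphic(adj1, adj2, rounds=3):
    """Necessary condition only: different histograms prove the graphs are
    NOT isomorphic; equal histograms are inconclusive (1-WL can be fooled)."""
    return wl_colors(adj1, rounds) == wl_colors(adj2, rounds)

# A triangle and a 3-node path are distinguished after one refinement round,
# while any relabeling of the triangle yields an identical histogram --
# exactly the permutation invariance a graph learner must respect.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path3 = {0: [1], 1: [0, 2], 2: [1]}
```

Note that the histogram is a multiset, so it is invariant to node relabeling by construction; this is the same design principle behind the permutation-invariant readouts surveyed later.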
References
on Artificial Intelligence, pp. 1–1, 2021.
[2] Z. Zhang, P. Cui, and W. Zhu, "Deep learning on graphs: A survey," 2018.
[3] J. Leskovec, "Representation Learning on Networks," WWW-18 Tutorial, 2018. [Online]. Available: http://snap.stanford.edu/proj/embeddings-www/
[4] M. Zhang, Z. Cui, M. Neumann, and Y. Chen, "An End-to-End Deep Learning Architecture for Graph Classification," in Proceedings of the 32nd Conference on Artificial Intelligence (AAAI), New Orleans, Louisiana, USA, February 2-7 2018, pp. 4438–4445.
[5] W. Cao, Z. Yan, Z. He, and Z. He, "A comprehensive survey on geometric deep learning," IEEE Access, vol. 8, pp. 35929–35949, 2020.
[6] M. Bronstein, J. Bruna, Y. Lecun, A. Szlam, and P. Vandergheynst, "Geometric deep learning: Going beyond euclidean data," IEEE Signal Processing Magazine, vol. 34, no. 4, pp. 18–42, Jul. 2017.
[7] W. L. Hamilton, R. Ying, and J. Leskovec, "Representation Learning on Graphs: Methods and Applications," IEEE Data Eng. Bull., vol. 40, no. 3, pp. 52–74, 2017.
[8] H. Cai, V. Zheng, and K. Chang, "A comprehensive survey of graph embedding: Problems, techniques, and applications," IEEE Transactions on Knowledge and Data Engineering, vol. 30, pp. 1616–1637, 2018.
[9] M. Kinderkhedia, "Learning representations of graph data: A survey," 2019.
[10] J. Zhou, G. Cui, Z. Zhang, C. Yang, Z. Liu, and M. Sun, "Graph neural networks: A review of methods and applications," CoRR, 2018.
[11] Z. Wu, S. Pan, F. Chen, G. Long, C. Zhang, and P. S. Yu, "A Comprehensive Survey on Graph Neural Networks," CoRR, 2019.
[12] Z. Chen, F. Chen, L. Zhang, T. Ji, K. Fu, L. Zhao, F. Chen, and C. Lu, "Bridging the gap between spatial and spectral domains: A survey on graph neural networks," CoRR, 2020.
[13] K. Borgwardt, E. Ghisu, F. Llinares-López, L. O’Bray, and B. Rieck, "Graph kernels: State-of-the-art and future challenges," Foundations and Trends in Machine Learning, vol. 13, no. 5-6, pp. 531–712, 2020.
[14] G. Nikolentzos, G. Siglidis, and M. Vazirgiannis, "Graph kernels: A survey," CoRR, vol. abs/1904.12218, 2019.
[15] Y. Zhu, W. Xu, J. Zhang, Q. Liu, S. Wu, and L. Wang, "Deep graph structure learning for robust representations: A survey," CoRR, 2021.
[16] C. Zhuang and Q. Ma, "Dual graph convolutional networks for graph-based semi-supervised classification," in Proceedings of World Wide Web Conference (WWW), Lyon, France, 2018, pp. 499–508.
[17] S. Cao, W. Lu, and Q. Xu, "Deep neural networks for learning graph representations," Proceedings of the Conference on Artificial Intelligence (AAAI), vol. 30, no. 1, Feb. 2016.
[18] Y. Yang, H. Chen, and J. Shao, "Triplet enhanced autoencoder: Model-free discriminative network embedding," in Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI), Macao, China, August 10-16 2019, pp. 5363–5369.
[19] F. R. K. Chung, Spectral Graph Theory. Providence, RI: American Mathematical Society, 1997.
[20] M. R. Garey and D. S. Johnson, Computers and Intractability: A Guide to the Theory of NP-Completeness. W. H. Freeman, 1979.
[21] D. Bieber, "The Weisfeiler-Lehman Isomorphism test," 2019. [Online]. Available: https://davidbieber.com/post/2019-05-10-weisfeiler-lehman-isomorphism-test/
[22] B. Weisfeiler and A. Leman, "The reduction of a graph to canonical form and the algebra which appears therein," Nauchno-Technicheskaya Informatsia, 1968.
[23] L. Babai, P. Erdös, and S. M. Selkow, "Random graph isomorphism," SIAM J. Comput., vol. 9, no. 3, pp. 628–635, 1980.
[24] M. Niepert, M. Ahmed, and K. Kutzkov, "Learning Convolutional Neural Networks for Graphs," in Proceedings of The 33rd International Conference on Machine Learning (ICML), vol. 48, New York, New York, USA, 20–22 Jun 2016, pp. 2014–2023.
[25] M. D. G. Mallea, P. Meltzer, and P. J. Bentley, "Capsule Neural Networks for Graph Classification using Explicit Tensorial Graph Representations," arXiv, 2019.
[26] H. Jin, W. Yu, and S. Li, "Graph regularized nonnegative matrix tri-factorization for overlapping community detection," Physica A: Statistical Mechanics and its Applications, vol. 515, no. C, pp. 376–387, 2019.
[27] M. Huang, Q. Jiang, Q. Qu, and A. Rasool, "An overlapping community detection approach in ego-splitting networks using symmetric nonnegative matrix factorization," Symmetry, vol. 13, no. 5, 2021.
[28] G. Ceddia, P. Pinoli, S. Ceri, and M. Masseroli, "Matrix factorization-based technique for drug repurposing predictions," IEEE J. Biomed. Health Informatics, vol. 24, no. 11, pp. 3162–3172, 2020.
[29] A. Mitra, P. Vijayan, S. Parthasarathy, and B. Ravindran, "A unified non-negative matrix factorization framework for semi-supervised learning on graphs," in Proceedings of International Conference on Data Mining (SDM), Cincinnati, Ohio, USA, May 7-9 2020, pp. 487–495.
[30] F. Ye, C. Chen, and Z. Zheng, "Deep Autoencoder-like Nonnegative Matrix Factorization for Community Detection," in Proceedings of the 27th International Conference on Information and Knowledge Management (CIKM), USA, 2018, pp. 1393–1402.
[31] J. Qiu, Y. Dong, H. Ma, J. Li, C. Wang, K. Wang, and J. Tang, "NetSMF: Large-scale network embedding as sparse matrix factorization," in The World Wide Web Conference (WWW), San Francisco, CA, USA, May 13-17 2019, pp. 1509–1520.
[32] W. L. Hamilton, "Graph representation learning," Synthesis Lectures on Artificial Intelligence and Machine Learning, vol. 14, no. 3, pp. 1–159, 2020.
[33] M. Ou, P. Cui, J. Pei, Z. Zhang, and W. Zhu, "Asymmetric transitivity preserving graph embedding," in Proceedings of International Conference on Knowledge Discovery and Data Mining (SIGKDD), San Francisco, CA, USA, August 13-17 2016, pp. 1105–1114.
[34] A. Grover and J. Leskovec, "node2vec: Scalable feature learning for networks," in Proceedings of International Conference on Knowledge Discovery and Data Mining (SIGKDD), 2016, pp. 855–864.
[35] B. Perozzi, R. Al-Rfou, and S. Skiena, "Deepwalk: Online learning of social representations," in Proceedings of International Conference on Knowledge Discovery and Data Mining (SIGKDD), 2014, pp. 701–710.
[36] J. Tang, M. Qu, M. Wang, M. Zhang, J. Yan, and Q. Mei, "LINE: large-scale information network embedding," in Proceedings of the 24th International Conference on World Wide Web (WWW), Florence, Italy, May 18-22 2015, pp. 1067–1077.
[37] A. Micheli, "Neural network for graphs: A contextual constructive approach," IEEE Transactions on Neural Networks, vol. 20, no. 3, pp. 498–511, 2009.
[38] T. N. Kipf and M. Welling, "Semi-supervised classification with graph convolutional networks," in International Conference on Learning Representations (ICLR), 2017.
[39] M. Zhang, Z. Cui, M. Neumann, and Y. Chen, "An end-to-end deep learning architecture for graph classification," in AAAI, 2018.
[40] B. Jiang, Z. Zhang, D. Lin, J. Tang, and B. Luo, "Semi-supervised learning with graph learning-convolutional networks," in Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, June 16-20 2019, pp. 11313–11320.
[41] P. Veličković, G. Cucurull, A. Casanova, A. Romero, P. Liò, and Y. Bengio, "Graph attention networks," in International Conference on Learning Representations (ICLR), 2018.
[42] Z. Zhang, P. Cui, J. Pei, X. Wang, and W. Zhu, "Eigengnn: A graph structure preserving plug-in for gnns," CoRR, 2020.
[43] F. Errica, M. Podda, D. Bacciu, and A. Micheli, "A fair comparison of graph neural networks for graph classification," in Proceedings of International Conference on Learning Representations (ICLR), 2020.
[44] D. Shuman, S. Narang, P. Frossard, A. Ortega, and P. Vandergheynst, "The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains," IEEE Signal Processing Magazine, vol. 30, pp. 83–98, 2013.
[45] A. Sandryhaila and J. M. F. Moura, "Discrete signal processing on graphs: Graph Fourier transform," in 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, 2013, pp. 6167–6170.
[46] S. Chen, R. Varma, A. Sandryhaila, and J. Kovacevic, "Discrete signal processing on graphs: Sampling theory," IEEE Transactions on Signal Processing, vol. 63, no. 24, pp. 6510–6523, 2015.
[47] J. Bruna, W. Zaremba, A. Szlam, and Y. LeCun, "Spectral networks and locally connected networks on graphs," in International Conference on Learning Representations (ICLR), Banff, AB, Canada, April 14-16 2014.
[48] M. Defferrard, X. Bresson, and P. Vandergheynst, "Convolutional neural networks on graphs with fast localized spectral filtering," in Advances in Neural Information Processing Systems (NIPS), D. Lee, M. Sugiyama, U. Luxburg, I. Guyon, and R. Garnett, Eds., vol. 29, 2016.
[49] R. Levie, F. Monti, X. Bresson, and M. M. Bronstein, "Cayleynets: Graph convolutional neural networks with complex rational spectral filters," IEEE Transactions on Signal Processing, vol. 67, no. 1, pp. 97–109, 2019.
[50] S. Yu, Y. Feng, D. Zhang, H. D. Bedru, B. Xu, and F. Xia, "Motif discovery in networks: A survey," Computer Science Review, vol. 37, p. 100267, 2020.
[51] N. Przulj, D. G. Corneil, and I. Jurisica, "Modeling interactome: scale-free or geometric?," Bioinformatics, vol. 20, no. 18, pp. 3508–3515, 2004.
[52] N. Shervashidze, S. Vishwanathan, T. Petri, K. Mehlhorn, and K. Borgwardt, "Efficient graphlet kernels for large graph comparison," in Proceedings of International Conference on Artificial Intelligence and Statistics (AISTATS), vol. 5, Clearwater Beach, Florida, USA, 16–18 Apr 2009, pp. 488–495.
[53] B. Schölkopf and A. J. Smola, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, 2002.
[54] P. Yanardag and S. Vishwanathan, "Deep Graph Kernels," in Proceedings of International Conference on Knowledge Discovery and Data Mining (KDD), USA, 2015, pp. 1365–1374.
[55] J. Ramon and T. Gärtner, "Expressivity versus efficiency of graph kernels," in Proceedings of European Conference on Machine Learning (ECML/PKDD), Cavtat-Dubrovnik, Croatia, Sep 22-23 2003, pp. 65–74.
[56] P. Mahé and J. Vert, "Graph kernels based on tree patterns for molecules," Mach. Learn., vol. 75, no. 1, pp. 3–35, 2009.
[57] T. Gärtner, P. A. Flach, and S. Wrobel, "On graph kernels: Hardness results and efficient alternatives." in COLT, vol. 2777, 2003, pp. 129–143.
[58] R. A. Rossi, N. K. Ahmed, E. Koh, S. Kim, A. Rao, and Y. Abbasi-Yadkori, "A structural graph representation learning framework," in International Conference on Web Search and Data Mining (WSDM), Houston, TX, USA, 2020, pp. 483–491.
[59] X. Li, W. Wei, X. Feng, X. Liu, and Z. Zheng, "Representation learning of graphs using graph convolutional multilayer networks based on motifs," Neurocomputing, vol. 464, pp. 218–226, 2021.
[60] G. Abuoda, G. D. F. Morales, and A. Aboulnaga, "Link prediction via higher-order motif features," in Proceedings of Machine Learning and Knowledge Discovery in Databases - European Conference (ECML) (PKDD), vol. 11906, Würzburg, Germany, Sep 16-20 2019, pp. 412–429.
[61] B. D. McKay and A. Piperno, "Practical graph isomorphism, II," J. Symb. Comput., vol. 60, pp. 94–112, 2014.
[62] P. Velickovic, "Theoretical foundations of graph neural networks," CST Wednesday Seminar, Feb 2021.
[63] M. Zaheer, S. Kottur, S. Ravanbakhsh, B. Poczos, R. R. Salakhutdinov, and A. J. Smola, "Deep Sets," in Advances in Neural Information Processing Systems (NIPS), 2017, pp. 3391–3401.
[64] E. Wagstaff, F. Fuchs, M. Engelcke, I. Posner, and M. A. Osborne, "On the limitations of representing functions on sets," in Proceedings of International Conference on Machine Learning (ICML), vol. 97, 2019, 9-15 June 2019, Long Beach, California, USA, 2019, pp. 6487–6494.
[65] H. Maron, H. Ben-Hamu, N. Shamir, and Y. Lipman, "Invariant and equivariant graph networks," in International Conference on Learning Representations (ICLR), New Orleans, LA, USA, May 6-9, 2019.
[66] H. Maron, E. Fetaya, N. Segol, and Y. Lipman, "On the universality of invariant networks," in Proceedings of International Conference on Machine Learning (ICML), vol. 97, Long Beach, California, USA, June, 9-15 2019, pp. 4363–4371.
[67] N. Keriven and G. Peyré, "Universal invariant and equivariant graph neural networks," in Advances in Neural Information Processing Systems (NIPS), vol. 32, 2019.
[68] W. Hamilton, Z. Ying, and J. Leskovec, "Inductive Representation Learning on Large Graphs," in Advances in Neural Information Processing Systems (NIPS), 2017, pp. 1024–1034.
[69] K. Xu, W. Hu, J. Leskovec, and S. Jegelka, "How Powerful are Graph Neural Networks?" in the 7th International Conference on Learning Representations (ICLR), USA, May 6-9 2019.
[70] J. Lee, Y. Lee, J. Kim, A. Kosiorek, S. Choi, and Y. W. Teh, "Set transformer: A framework for attention-based permutation-invariant neural networks," in Proceedings of International Conference on Machine Learning (ICML), vol. 97, Jun, 09-15 2019, pp. 3744–3753.
[71] H. Zhu, F. Feng, X. He, X. Wang, Y. Li, K. Zheng, and Y. Zhang, "Bilinear graph neural network with neighbor interactions," in Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence (IJCAI), 2020, pp. 1452–1458.
[72] S. Verma and Z.-L. Zhang, "Graph Capsule Convolutional Neural Networks," arXiv, 2018.
[73] P. Meltzer, M. D. G. Mallea, and P. J. Bentley, "PiNet: A Permutation Invariant Graph Neural Network for Graph Classification," CoRR, 2019.
[74] M. Rupp, A. Tkatchenko, K.-R. Müller, and O. A. von Lilienfeld, "Fast and accurate modeling of molecular atomization energies with machine learning," Physical Review Letters, vol. 108, p. 058301, 2012.
[75] G. Montavon, K. Hansen, S. Fazli, M. Rupp, F. Biegler, A. Ziehe, A. Tkatchenko, A. Lilienfeld, and K.-R. Müller, "Learning invariant representations of molecules for atomization energy prediction," in Advances in Neural Information Processing Systems (NIPS), vol. 25, 2012.
[76] Y. Ma, S. Wang, C. C. Aggarwal, and J. Tang, "Graph convolutional networks with eigenpooling," in Proceedings of International Conference on Knowledge Discovery & Data Mining, (KDD), Anchorage, AK, USA, August, 4-8 2019, pp. 723–731.
[77] K. Zhao, Y. Rong, J. X. Yu, J. Huang, and H. Zhang, "Graph ordering: Towards the optimal by learning," CoRR, 2020.
[78] Z. Chen, L. Chen, S. Villar, and J. Bruna, "Can graph neural networks count substructures?" in Advances in Neural Information Processing Systems (NeurIPS) 33, December 6-12 2020.
[79] G. Bouritsas, F. Frasca, S. Zafeiriou, and M. M. Bronstein, "Improving graph neural network expressivity via subgraph isomorphism counting," CoRR, 2020.
[80] D. Floros, N. Pitsianis, and X. Sun, "Fast graphlet transform of sparse graphs," in 2020 IEEE High Performance Extreme Computing Conference, (HPEC), Waltham, MA, USA, Sep 22-24 2020, pp. 1–8.
[81] J. Gilmer, S. S. Schoenholz, P. F. Riley, O. Vinyals, and G. E. Dahl, "Neural message passing for quantum chemistry," in Proceedings of International Conference on Machine Learning, ICML, vol. 70, Sydney, NSW, Australia, August, 6-11 2017, pp. 1263–1272.
[82] V. Arvind, F. Fuhlbrück, J. Köbler, and O. Verbitsky, "On Weisfeiler-Leman invariance: Subgraph counts and related graph properties," in Proceedings of Fundamentals of Computation Theory (FCT), vol. 11651, Copenhagen, Denmark, August 12-14 2019, pp. 111–125.
[83] R. L. Murphy, B. Srinivasan, V. A. Rao, and B. Ribeiro, "Janossy pooling: Learning deep permutation-invariant functions for variable-size inputs," in International Conference on Learning Representations (ICLR), New Orleans, LA, USA, May, 6-9 2019.
[84] B. Bloem-Reddy and Y. W. Teh, "Probabilistic symmetries and invariant neural networks," J. Mach. Learn. Res., vol. 21, pp. 90:1–90:61, 2020.
[85] R. Murphy, B. Srinivasan, V. Rao, and B. Ribeiro, "Relational pooling for graph representations," in Proceedings of International Conference on Machine Learning (ICML), vol. 97, 09–15 Jun 2019, pp. 4663–4673.
[86] E. Cohen-Karlik, A. B. David, and A. Globerson, "Regularizing towards permutation invariance in recurrent models," in Advances in Neural Information Processing Systems (NIPS), Dec, 6-12 2020.
[87] J. Moore and J. Neville, "Deep collective inference," in Proceedings of Conference on Artificial Intelligence (AAAI), San Francisco, California, USA, Feb, 4-9 2017, pp. 2364–2372.