Theory of graph neural networks: representation and learning
Publication: 6200219 (MaRDI item Q6200219)
Mathematics Subject Classification:
- Learning and adaptive systems in artificial intelligence (68T05)
- Applications of graph theory (05C90)
- Artificial neural networks and deep learning (68T07)
- Computational learning theory (68Q32)
- Graph theory (including graph drawing) in computer science (68R10)
- Isomorphism problems in graph theory (reconstruction conjecture, etc.) and homomorphisms (subgraph embedding, etc.) (05C60)
- Graph representations (geometric and intersection representations, etc.) (05C62)
- Approximations and expansions (41A99)
Abstract: Graph Neural Networks (GNNs), neural network architectures designed to learn representations of graphs, have become a popular model for prediction tasks on nodes, graphs, and configurations of points, with wide success in practice. This article surveys a selection of emerging theoretical results on the approximation and learning properties of the widely used message-passing GNNs and of higher-order GNNs, focusing on representation, generalization, and extrapolation. Along the way, it summarizes the underlying mathematical connections.
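The message-passing GNNs referenced in the abstract share a common computational pattern: in each layer, every node aggregates the feature vectors of its neighbors and combines the result with its own features. A minimal NumPy sketch of one such layer is given below; it is illustrative only (the function and weight names are hypothetical, sum aggregation with a ReLU update is just one of many choices covered by the surveyed results, and nothing here is taken from the paper itself).

```python
import numpy as np

def message_passing_layer(A, X, W_self, W_neigh):
    """One message-passing layer with sum aggregation (illustrative sketch).

    A: (n, n) adjacency matrix; X: (n, d) node features;
    W_self, W_neigh: (d, d') weight matrices (hypothetical names).
    """
    neighbor_sum = A @ X  # each node sums the features of its neighbors
    # combine own features with the aggregated neighborhood, then apply ReLU
    return np.maximum(X @ W_self + neighbor_sum @ W_neigh, 0.0)

# Toy usage: a 3-node path graph with 2-dimensional node features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = rng.normal(size=(3, 2))
H = message_passing_layer(A, X, rng.normal(size=(2, 4)), rng.normal(size=(2, 4)))
print(H.shape)  # (3, 4)
```

The choice of aggregator is not incidental: the representation results surveyed in the article relate the distinguishing power of such layers to Weisfeiler-Leman color refinement, the subject of several of the works cited below.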
Cites work
- scientific article; zbMATH DE number 3141308 (no title available)
- scientific article; zbMATH DE number 7626787 (no title available)
- DOI: 10.1162/153244303321897690
- A congruence theorem for trees
- An optimal lower bound on the number of variables for graph identification
- Approximation by superpositions of a sigmoidal function
- Distributed Computing: A Locality-Sensitive Approach
- Graph limits and exchangeable random graphs
- Locality in Distributed Graph Algorithms
- On a routing problem
- On construction and identification of graphs. With contributions by A. Lehman, G. M. Adelson-Velsky, V. Arlazarov, I. Faragev, A. Uskov, I. Zuev, M. Rosenfeld and B. Weisfeiler
- On tables of random numbers
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- On the power of color refinement
- Pebble games and linear equations
- Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
- Random Graph Isomorphism
- Representations for partially exchangeable arrays of random variables
- The Vapnik-Chervonenkis dimension of graph and recursive neural networks
- The graph isomorphism disease
- Weak models of distributed computing, with connections to modal logic
Cited in (14)
- Lower and upper bounds for numbers of linear regions of graph convolutional networks
- A comprehensive survey on deep graph representation learning methods
- Learning stability on graphs
- The logic of graph neural networks
- A charge-preserving method for solving graph neural diffusion networks
- Geometric deep learning: a temperature based analysis of graph neural networks
- Graph rewriting for graph neural networks
- Weisfeiler-Lehman goes dynamic: an analysis of the expressive power of graph neural networks for attributed and dynamic graphs
- Revisiting graph neural networks from hybrid regularized graph signal reconstruction
- Polarized message-passing in graph neural networks
- Geometric deep learning for design of catalysts and molecules
- Principles for initialization and architecture selection in graph neural networks with ReLU activations
- GPNet: simplifying graph neural networks via multi-channel geometric polynomials
- A gaze into the internal logic of graph neural networks, with logic