Theory of graph neural networks: representation and learning
DOI: 10.4171/ICM2022/162 · arXiv: 2204.07697 · OpenAlex: W4389775256 · MaRDI QID: Q6200219 · FDO: Q6200219
Authors: Stefanie Jegelka
Publication date: 22 March 2024
Published in: International Congress of Mathematicians
Full work available at URL: https://arxiv.org/abs/2204.07697
MSC classification
- Learning and adaptive systems in artificial intelligence (68T05)
- Applications of graph theory (05C90)
- Artificial neural networks and deep learning (68T07)
- Computational learning theory (68Q32)
- Graph theory (including graph drawing) in computer science (68R10)
- Isomorphism problems in graph theory (reconstruction conjecture, etc.) and homomorphisms (subgraph embedding, etc.) (05C60)
- Graph representations (geometric and intersection representations, etc.) (05C62)
- Approximations and expansions (41A99)
Cites Work
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- On a routing problem
- Representations for partially exchangeable arrays of random variables
- Graph limits and exchangeable random graphs
- Title not available
- DOI: 10.1162/153244303321897690
- Distributed Computing: A Locality-Sensitive Approach
- Approximation by superpositions of a sigmoidal function
- Random Graph Isomorphism
- The graph isomorphism disease
- A congruence theorem for trees
- Locality in Distributed Graph Algorithms
- An optimal lower bound on the number of variables for graph identification
- On construction and identification of graphs. With contributions by A. Lehman, G. M. Adelson-Velsky, V. Arlazarov, I. Faragev, A. Uskov, I. Zuev, M. Rosenfeld and B. Weisfeiler
- Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
- On tables of random numbers
- Pebble games and linear equations
- The Vapnik-Chervonenkis dimension of graph and recursive neural networks
- Weak models of distributed computing, with connections to modal logic
- On the power of color refinement
- Title not available
Cited In (14)
- Polarized message-passing in graph neural networks
- Lower and upper bounds for numbers of linear regions of graph convolutional networks
- Revisiting graph neural networks from hybrid regularized graph signal reconstruction
- A gaze into the internal logic of graph neural networks, with logic
- Geometric deep learning: a temperature based analysis of graph neural networks
- A charge-preserving method for solving graph neural diffusion networks
- Learning stability on graphs
- The logic of graph neural networks
- Geometric deep learning for design of catalysts and molecules
- Principles for initialization and architecture selection in graph neural networks with ReLU activations
- GPNet: simplifying graph neural networks via multi-channel geometric polynomials
- Graph rewriting for graph neural networks
- Weisfeiler-Lehman goes dynamic: an analysis of the expressive power of graph neural networks for attributed and dynamic graphs
- A comprehensive survey on deep graph representation learning methods