Understanding the message passing in graph neural networks via power iteration clustering
From MaRDI portal
Publication:6078751
Abstract: The mechanism of message passing in graph neural networks (GNNs) remains poorly understood: beyond the analogy with convolutional neural networks, no theoretical origin for GNNs has been proposed. To our surprise, message passing can be best understood in terms of power iteration. By fully or partly removing the activation functions and layer weights of GNNs, we propose subspace power iteration clustering (SPIC) models that learn iteratively with only one aggregator. Experiments show that our models extend GNNs and enhance their capability to process networks with random features. Moreover, we demonstrate the design redundancy of some state-of-the-art GNNs and define a lower limit for model evaluation by a random message-passing aggregator. Our findings push the boundaries of the theoretical understanding of neural networks.
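The abstract's core observation can be illustrated with a small NumPy sketch. This is an assumption-laden toy, not the authors' exact SPIC formulation: it uses a GCN-style symmetric normalized aggregator on a hypothetical two-clique graph, drops all activations and layer weights, and shows that repeated aggregation of random features amounts to block (subspace) power iteration, whose dominant eigenvectors reveal the cluster structure.

```python
import numpy as np

# Toy graph (illustrative): two 3-node cliques joined by a single bridge edge,
# so the ground-truth clustering is {0, 1, 2} vs. {3, 4, 5}.
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

# One common message-passing aggregator: D^{-1/2} (A + I) D^{-1/2}
# (the GCN choice; the paper studies aggregators in general).
A_hat = A + np.eye(len(A))
d = A_hat.sum(axis=1)
S = A_hat / np.sqrt(np.outer(d, d))

# Random node features. With activations and layer weights removed, each
# "layer" is just one multiplication by the aggregator, so stacking layers
# performs block power iteration on the feature matrix; QR keeps the
# iterated columns orthonormal so they converge to the top eigenvectors.
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 2))
for _ in range(50):
    X = S @ X
    X, _ = np.linalg.qr(X)

# The leading eigenvector is near-constant (degree-weighted); the second
# one changes sign between the two cliques, so its sign pattern clusters
# the nodes without any training.
labels = (X[:, 1] > 0).astype(int)
print(labels)
```

Nodes 0-2 receive one label and nodes 3-5 the other (which clique gets label 1 depends on the arbitrary eigenvector sign). This is the sense in which untrained message passing already performs spectral-style clustering.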
Recommendations
- Mean-field theory of graph neural networks in graph partitioning
- \(k\)-hop graph neural networks
- Fast Haar transforms for graph neural networks
- Pseudoinverse graph convolutional networks. Fast filters tailored for large eigengaps of dense graphs and hypergraphs
- Variational models for signal processing with graph neural networks
Cites work
Cited in (13)
- Interpreting the basis path set in neural networks
- \(k\)-hop graph neural networks
- Lower and upper bounds for numbers of linear regions of graph convolutional networks
- Learning stability on graphs
- Graph routing between capsules
- A charge-preserving method for solving graph neural diffusion networks
- Variational models for signal processing with graph neural networks
- Geometric deep learning: a temperature based analysis of graph neural networks
- Mean-field theory of graph neural networks in graph partitioning
- Theory of graph neural networks: representation and learning
- Revisiting graph neural networks from hybrid regularized graph signal reconstruction
- Polarized message-passing in graph neural networks
- Bosonic random walk neural networks for graph learning