The Vapnik-Chervonenkis dimension of graph and recursive neural networks
DOI: 10.1016/j.neunet.2018.08.010
zbMATH: 1434.68524
OpenAlex: W2889399096
Wikidata: Q91467455
Scholia: Q91467455
MaRDI QID: Q2182896
Markus Hagenbuchner, Franco Scarselli, Ah Chung Tsoi
Publication date: 26 May 2020
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2018.08.010
Related Items (3)
- Stability and generalization of graph convolutional networks in eigen-domains
- On the Explainability of Graph Convolutional Network With GCN Tangent Kernel
- Theory of graph neural networks: representation and learning
Cites Work
- Supervised sequence labelling with recurrent neural networks
- Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
- Neural networks with quadratic VC dimension
- Kernels and distances for structured data
- Complexity of stratifications of semi-Pfaffian sets
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
- DOI: 10.1162/153244303321897690
- An efficient method for finding the minimum of a function of several variables without calculating derivatives
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- On the Betti Numbers of Real Varieties