Neural Networks with Local Receptive Fields and Superlinear VC Dimension
From MaRDI portal
Publication: 4330677
DOI: 10.1162/089976602317319018
zbMath: 0993.68088
OpenAlex: W2147359466
Wikidata: Q52043105 (Scholia: Q52043105)
MaRDI QID: Q4330677
Publication date: 14 May 2002
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/089976602317319018
Related Items (2)
- Descartes' Rule of Signs for Radial Basis Function Neural Networks
- On the Capabilities of Higher-Order Neurons: A Radial Basis Function Approach
Cites Work
- Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
- Neural networks with quadratic VC dimension
- Some special Vapnik-Chervonenkis classes
- Decision theoretic generalizations of the PAC model for neural net and other learning applications
- Balls in \(\mathbb{R}^k\) do not cut all subsets of \(k+2\) points
- Density and dimension
- Exact VC-dimension of Boolean monomials
- On the complexity of learning for spiking neurons with temporal coding
- Sample sizes for threshold networks with equivalences
- Classification by polynomial surfaces
- Learnability and the Vapnik-Chervonenkis dimension
- Neural Nets with Superlinear VC-Dimension
- Enumeration of Seven-Argument Threshold Functions