Neural networks with quadratic VC dimension
DOI: 10.1006/JCSS.1997.1479
zbMATH Open: 0869.68089
OpenAlex: W2050946712
MaRDI QID: Q676433
Authors: Pascal Koiran, Eduardo D. Sontag
Publication date: 18 March 1997
Published in: Journal of Computer and System Sciences
Full work available at URL: https://semanticscholar.org/paper/1667592ce31f3888be45ee6d6695919439c0e5eb
Cites Work
- Learnability and the Vapnik-Chervonenkis dimension
- On a theory of computation and complexity over the real numbers: NP-completeness, recursive functions and universal machines
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- A theory of the learnable
- Feedforward nets for interpolation and classification
- Probably Approximate Learning of Sets and Functions
- Scale-sensitive dimensions, uniform convergence, and learnability
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
- Bounds for the computational power and learning complexity of analog neural nets
Cited In
- On the Capabilities of Higher-Order Neurons: A Radial Basis Function Approach
- Relation between weight size and degree of over-fitting in neural network regression
- Vapnik-Chervonenkis dimension of recurrent neural networks
- On the stability and generalization of neural networks with VC dimension and fuzzy feature encoders
- Descartes' Rule of Signs for Radial Basis Function Neural Networks
- On neural network design. I: Using the MVQ algorithm
- Neural Quadratic Discriminant Analysis: Nonlinear Decoding with V1-Like Computation
- On the complexity of learning for spiking neurons with temporal coding
- Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
- Neural Networks with Local Receptive Fields and Superlinear VC Dimension
- Neural Nets with Superlinear VC-Dimension
- On the complexity of computing and learning with multiplicative neural networks
- The Vapnik-Chervonenkis dimension of graph and recursive neural networks
- On the sample complexity for nonoverlapping neural networks
- Theory of Classification: a Survey of Some Recent Advances