Neural networks with quadratic VC dimension
Publication: 676433
DOI: 10.1006/jcss.1997.1479
zbMath: 0869.68089
OpenAlex: W2050946712
MaRDI QID: Q676433
Pascal Koiran, Eduardo D. Sontag
Publication date: 18 March 1997
Published in: Journal of Computer and System Sciences
Full work available at URL: https://semanticscholar.org/paper/1667592ce31f3888be45ee6d6695919439c0e5eb
Related Items
- Descartes' Rule of Signs for Radial Basis Function Neural Networks
- The Vapnik-Chervonenkis dimension of graph and recursive neural networks
- Neural Networks with Local Receptive Fields and Superlinear VC Dimension
- On the stability and generalization of neural networks with VC dimension and fuzzy feature encoders
- On the Capabilities of Higher-Order Neurons: A Radial Basis Function Approach
- On the Complexity of Computing and Learning with Multiplicative Neural Networks
- Theory of Classification: a Survey of Some Recent Advances
- Vapnik-Chervonenkis dimension of recurrent neural networks
- On the complexity of learning for spiking neurons with temporal coding
Cites Work
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Feedforward nets for interpolation and classification
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
- Probably Approximate Learning of Sets and Functions
- Learnability and the Vapnik-Chervonenkis dimension
- A theory of the learnable
- Scale-sensitive dimensions, uniform convergence, and learnability
- On a theory of computation and complexity over the real numbers: NP-completeness, recursive functions and universal machines
- Bounds for the computational power and learning complexity of analog neural nets