Vapnik-Chervonenkis dimension of recurrent neural networks
Publication: 1265745
DOI: 10.1016/S0166-218X(98)00014-6
zbMath: 0907.68156
OpenAlex: W2053955420
MaRDI QID: Q1265745
Publication date: 15 December 1998
Published in: Discrete Applied Mathematics
Full work available at URL: http://www.elsevier.com/locate/dam
Related Items (5)
- Complete controllability of continuous-time recurrent neural networks
- On the Complexity of Computing and Learning with Multiplicative Neural Networks
- A learning result for continuous-time recurrent neural networks
- Unnamed Item
- Compressive sensing and neural networks from a statistical learning perspective
Cites Work
- Unnamed Item
- Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
- Neural networks with quadratic VC dimension
- Feedforward nets for interpolation and classification
- Analog computation via neural networks
- On the computational power of neural nets
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
- Learnability and the Vapnik-Chervonenkis dimension
- Sample complexity for learning recurrent perceptron mappings
- Neural networks and physical systems with emergent collective computational abilities.