Vapnik-Chervonenkis dimension of recurrent neural networks
From MaRDI portal
Publication:1265745
DOI: 10.1016/S0166-218X(98)00014-6
zbMATH Open: 0907.68156
OpenAlex: W2053955420
MaRDI QID: Q1265745
Authors: Pascal Koiran, Eduardo D. Sontag
Publication date: 15 December 1998
Published in: Discrete Applied Mathematics
Full work available at URL: http://www.elsevier.com/locate/dam
Cites Work
- Learnability and the Vapnik-Chervonenkis dimension
- Neural networks and physical systems with emergent collective computational abilities
- Title not available
- Feedforward nets for interpolation and classification
- On the computational power of neural nets
- Analog computation via neural networks
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
- Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
- Neural networks with quadratic VC dimension
- Sample complexity for learning recurrent perceptron mappings
Cited In (11)
- Title not available
- A learning result for continuous-time recurrent neural networks
- Complete controllability of continuous-time recurrent neural networks
- Title not available
- Title not available
- Neural Networks with Local Receptive Fields and Superlinear VC Dimension
- Neural Nets with Superlinear VC-Dimension
- Compressive sensing and neural networks from a statistical learning perspective
- On the complexity of computing and learning with multiplicative neural networks
- The Vapnik-Chervonenkis dimension of graph and recursive neural networks
- On the sample complexity for nonoverlapping neural networks