Vapnik-Chervonenkis dimension of recurrent neural networks
From MaRDI portal
Cites work
- Scientific article, zbMATH DE number 47820 (title unavailable)
- Analog computation via neural networks
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
- Feedforward nets for interpolation and classification
- Learnability and the Vapnik-Chervonenkis dimension
- Neural networks and physical systems with emergent collective computational abilities
- Neural networks with quadratic VC dimension
- On the computational power of neural nets
- Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
- Sample complexity for learning recurrent perceptron mappings
Cited in (11)
- Scientific article, zbMATH DE number 922640 (title unavailable)
- A learning result for continuous-time recurrent neural networks
- Complete controllability of continuous-time recurrent neural networks
- Scientific article, zbMATH DE number 7306919 (title unavailable)
- Scientific article, zbMATH DE number 1843097 (title unavailable)
- Neural Networks with Local Receptive Fields and Superlinear VC Dimension
- Neural Nets with Superlinear VC-Dimension
- Compressive sensing and neural networks from a statistical learning perspective
- On the complexity of computing and learning with multiplicative neural networks
- The Vapnik-Chervonenkis dimension of graph and recursive neural networks
- On the sample complexity for nonoverlapping neural networks
This page was built for publication: Vapnik-Chervonenkis dimension of recurrent neural networks (MaRDI item Q1265745)