Bounding sample size with the Vapnik-Chervonenkis dimension
Publication: 1209149
DOI: 10.1016/0166-218X(93)90179-R
zbMATH Open: 0784.68070
MaRDI QID: Q1209149
Authors: John Shawe-Taylor, Martin Anthony, Norman L. Biggs
Publication date: 16 May 1993
Published in: Discrete Applied Mathematics
Recommendations
- Learning faster than promised by the Vapnik-Chervonenkis dimension
- Results on learnability and the Vapnik-Chervonenkis dimension
- Learnability and the Vapnik-Chervonenkis dimension
- A result of Vapnik with applications
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
Cites Work
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Learnability and the Vapnik-Chervonenkis dimension
- ε-nets and simplex range queries
- A theory of the learnable
- Computational limitations on learning from examples
- Quantifying inductive bias: AI learning algorithms and Valiant's learning framework
Cited In (11)
- PAC-learning from general examples
- Learning faster than promised by the Vapnik-Chervonenkis dimension
- Using the doubling dimension to analyze the generalization of learning algorithms
- Learning with side information: PAC learning bounds
- An approach to guided learning of Boolean functions
- The optimal sample complexity of PAC learning
- A generalization of Sauer's lemma
- Combinatorics and connectionism
- Improved bounds on the sample complexity of learning
- A result of Vapnik with applications
- Valid Generalisation from Approximate Interpolation