Bounding sample size with the Vapnik-Chervonenkis dimension
From MaRDI portal
Recommendations
- Learning faster than promised by the Vapnik-Chervonenkis dimension
- Results on learnability and the Vapnik-Chervonenkis dimension
- Learnability and the Vapnik-Chervonenkis dimension
- A result of Vapnik with applications
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
Cites work
- A theory of the learnable
- Computational limitations on learning from examples
- Learnability and the Vapnik-Chervonenkis dimension
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Quantifying inductive bias: AI learning algorithms and Valiant's learning framework
- ε-nets and simplex range queries
Cited in (11)
- PAC-learning from general examples
- Learning faster than promised by the Vapnik-Chervonenkis dimension
- Using the doubling dimension to analyze the generalization of learning algorithms
- Learning with side information: PAC learning bounds
- An approach to guided learning of Boolean functions
- The optimal sample complexity of PAC learning
- A generalization of Sauer's lemma
- Combinatorics and connectionism
- A result of Vapnik with applications
- Improved bounds on the sample complexity of learning
- Valid Generalisation from Approximate Interpolation
This page was built for publication: Bounding sample size with the Vapnik-Chervonenkis dimension (MaRDI item Q1209149)