Scale-sensitive dimensions and skeleton estimates for classification
Cited works
- Untitled scientific article (zbMATH DE number 3738628)
- Untitled scientific article (zbMATH DE number 89080)
- Untitled scientific article (zbMATH DE number 893887)
- A metric entropy bound is not sufficient for learnability
- A result of Vapnik with applications
- Adaptive model selection using empirical complexities
- Convergence of stochastic processes
- Covering numbers for real-valued function classes
- Decision theoretic generalizations of the PAC model for neural net and other learning applications
- Efficient agnostic learning of neural networks with bounded fan-in
- Efficient distribution-free learning of probabilistic concepts
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Fat-shattering and the learnability of real-valued functions
- Function Learning from Interpolation
- Learnability and the Vapnik-Chervonenkis dimension
- Learning by canonical smooth estimation. I. Simultaneous estimation
- Learning by canonical smooth estimation. II. Learning and choice of model complexity
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- On the density of families of sets
- Scale-sensitive dimensions, uniform convergence, and learnability