Scale-sensitive dimensions and skeleton estimates for classification
Publication: 1265744
DOI: 10.1016/S0166-218X(98)00013-4
zbMath: 0934.62065
OpenAlex: W1988355994
MaRDI QID: Q1265744
Publication date: 27 April 2000
Published in: Discrete Applied Mathematics
Full work available at URL: https://doi.org/10.1016/s0166-218x(98)00013-4
Cites Work
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Decision theoretic generalizations of the PAC model for neural net and other learning applications
- A result of Vapnik with applications
- Efficient distribution-free learning of probabilistic concepts
- Adaptive model selection using empirical complexities
- Fat-shattering and the learnability of real-valued functions
- On the density of families of sets
- Learnability and the Vapnik-Chervonenkis dimension
- Efficient agnostic learning of neural networks with bounded fan-in
- Covering numbers for real-valued function classes
- Scale-sensitive dimensions, uniform convergence, and learnability
- Function Learning from Interpolation
- A metric entropy bound is not sufficient for learnability
- Learning by canonical smooth estimation. I. Simultaneous estimation
- Learning by canonical smooth estimation. II. Learning and choice of model complexity
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Convergence of stochastic processes