Using the doubling dimension to analyze the generalization of learning algorithms
From MaRDI portal
Publication: Q923877
DOI: 10.1016/j.jcss.2009.01.003
zbMath: 1175.68315
OpenAlex: W2095439633
MaRDI QID: Q923877
Philip M. Long, Yi Li, Nader H. Bshouty
Publication date: 24 July 2009
Published in: Journal of Computer and System Sciences
Full work available at URL: https://doi.org/10.1016/j.jcss.2009.01.003
Keywords: learning theory, local complexity, generalization, PAC learning, statistical learning theory, doubling dimension, doubling metric
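As a brief illustration of the "doubling dimension" keyword above (not part of the original record): a metric space has doubling constant λ if every ball of radius r can be covered by at most λ balls of radius r/2, and doubling dimension log₂ λ. The sketch below estimates λ for a finite planar point set by greedily covering each ball with half-radius balls; the function names and the choice of a grid example are illustrative assumptions, not from the paper.

```python
import math

def dist(p, q):
    # Euclidean distance in the plane (the assumed metric).
    return math.hypot(p[0] - q[0], p[1] - q[1])

def ball(points, center, r):
    # Points of the finite metric space within distance r of center.
    return [p for p in points if dist(p, center) <= r]

def cover_count(points, center, r):
    # Greedily cover ball(center, r) with balls of radius r/2 centered
    # at points of the space; returns how many the greedy cover used
    # (an upper bound on the optimal cover size for this ball).
    uncovered = set(ball(points, center, r))
    count = 0
    while uncovered:
        c = uncovered.pop()                    # any uncovered point becomes a center
        uncovered -= set(ball(points, c, r / 2))
        count += 1
    return count

def doubling_constant_estimate(points, radii):
    # Max greedy cover count over all centers and sampled radii:
    # an upper-bound-style estimate of the doubling constant lambda.
    return max(cover_count(points, c, r) for c in points for r in radii)

# A 4x4 grid in the plane: subsets of R^2 are doubling, so the
# estimated lambda stays bounded as a small constant.
grid = [(x, y) for x in range(4) for y in range(4)]
lam = doubling_constant_estimate(grid, radii=[1.0, 2.0, 4.0])
print("lambda estimate:", lam, " doubling dimension estimate:", math.log2(lam))
```

Since the greedy cover is not optimal, this over-estimates λ; it only conveys that a bounded doubling constant is a local, scale-by-scale notion of dimension, which is what the paper exploits in place of global complexity measures.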
Related Items (7)
- Localization of VC classes: beyond local Rademacher complexities
- VC bounds on the cardinality of nearly orthogonal function classes
- Active Nearest-Neighbor Learning in Metric Spaces
- Two proofs for shallow packings
- When are epsilon-nets small?
- Non-uniform packings
- On the Impossibility of Dimension Reduction for Doubling Subsets of $\ell_{p}$
Cites Work
- Learnability with respect to fixed distributions
- An upper bound on the sample complexity of PAC-learning halfspaces with respect to the uniform distribution
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Bounding sample size with the Vapnik-Chervonenkis dimension
- Central limit theorems for empirical measures
- Prediction, learning, uniform convergence, and scale-sensitive dimensions
- Predicting \(\{ 0,1\}\)-functions on randomly drawn points
- Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension
- Lectures on analysis on metric spaces
- A general lower bound on the number of examples needed for learning
- Apple tasting.
- Weak convergence and empirical processes. With applications to statistics
- Local Rademacher complexities
- Triangulation and embedding using small sets of beacons
- Learnability and the Vapnik-Chervonenkis dimension
- Information-theoretic upper and lower bounds for statistical estimation
- Bypassing the embedding
- Lipschitz embeddings into $\mathbb{R}^n$ (Plongements lipschitziens dans $\mathbb{R}^n$)
- DOI: 10.1162/1532443041424337
- Neural Network Learning
- Distance estimation and object location via rings of neighbors
- On the locality of bounded growth
- Automata, Languages and Programming
- Fast Construction of Nets in Low-Dimensional Metrics and Their Applications
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities