Complexities of convex combinations and bounding the generalization error in classification
DOI: 10.1214/009053605000000228 · zbMATH: 1080.62045 · arXiv: math/0405356 · OpenAlex: W1980526554 · Wikidata: Q105584231 · Scholia: Q105584231 · MaRDI QID: Q2583410
Vladimir I. Koltchinskii, Dmitriy Panchenko
Publication date: 16 January 2006
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/math/0405356
Mathematics Subject Classification:
- Asymptotic properties of nonparametric inference (62G20)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Computational learning theory (68Q32)
- Analysis of algorithms and problem complexity (68Q25)
- Nonparametric estimation (62G05)
- Strong limit theorems (60F15)
- Artificial intelligence (68T99)
- Complexity and performance of numerical algorithms (65Y20)
Cites Work
- Bagging predictors
- Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension
- Bounds on margin distributions in learning problems
- Symmetrization approach to concentration inequalities for empirical processes
- Arcing classifiers. (With discussion)
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Some extensions of an inequality of Vapnik and Chervonenkis
- Bounding the generalization error of convex combinations of classifiers: balancing the dimensionality and the margins
- Process consistency for AdaBoost
- On the Bayes-risk consistency of regularized boosting methods
- Statistical behavior and consistency of classification methods based on convex risk minimization
- Aggregated estimators and empirical complexity for least square regression
- Weak convergence and empirical processes. With applications to statistics
- Improved boosting algorithms using confidence-rated predictions
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Local Rademacher complexities
- Boosting with early stopping: convergence and consistency
- DOI: 10.1162/153244303768966111
- Uniform Central Limit Theorems
- The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network
- DOI: 10.1162/1532443041424319
- DOI: 10.1162/1532443041827925
- Learning Theory and Kernel Machines
- Learning Theory and Kernel Machines
- Convexity, Classification, and Risk Bounds
- Some applications of concentration inequalities to statistics
- A note on Talagrand's concentration inequality