Simultaneous adaptation to the margin and to complexity in classification
From MaRDI portal
Publication:2456017
DOI: 10.1214/009053607000000055
zbMath: 1209.62146
arXiv: math/0509696
OpenAlex: W1981292117
MaRDI QID: Q2456017
Publication date: 17 October 2007
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/math/0509696
Keywords: classification; aggregation; statistical learning; SVM; margin; fast rates of convergence; excess risk; complexity of classes of sets
MSC: Classification and discrimination; cluster analysis (statistical aspects) (62H30); Pattern recognition, speech recognition (68T10)
Related Items (17)
- Classification in general finite dimensional spaces with the \(k\)-nearest neighbor rule
- Fast learning rates in statistical inference through aggregation
- Ordered smoothers with exponential weighting
- PAC-Bayesian risk bounds for group-analysis sparse regression by exponential weighting
- Statistical performance of support vector machines
- Risk bounds for CART classifiers under a margin condition
- On the optimality of the aggregate with exponential weights for low temperatures
- Kullback-Leibler aggregation and misspecified generalized linear models
- Margin-adaptive model selection in statistical learning
- Concentration inequalities for the exponential weighting method
- Optimal rates of aggregation in classification under low noise assumption
- Sparse estimation by exponential weighting
- A universal procedure for aggregating estimators
- Bandwidth selection in kernel empirical risk minimization via the gradient
- Set structured global empirical risk minimizers are rate optimal in general dimensions
- Sharp oracle inequalities for low-complexity priors
- Sampling from non-smooth distributions through Langevin diffusion
Cites Work
- Minimax theory of image reconstruction
- Risk bounds for statistical learning
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- Fast rates for support vector machines using Gaussian kernels
- Fast learning rates for plug-in classifiers
- A decision-theoretic generalization of on-line learning and an application to boosting
- Combining different procedures for adaptive regression
- Smooth discrimination analysis
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Mixing strategies for density estimation.
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Functional aggregation for nonparametric regression.
- Analyzing bagging
- Complexity regularization via localized random penalties
- Statistical learning theory and stochastic optimization. École d'Été de Probabilités de Saint-Flour XXXI -- 2001.
- On the Bayes-risk consistency of regularized boosting methods.
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Optimal aggregation of classifiers in statistical learning.
- Asymptotical minimax recovery of sets with smooth boundaries
- Support-vector networks
- Metric entropy of some classes of sets with differentiable boundaries
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Statistical performance of support vector machines
- Recursive aggregation of estimators by the mirror descent algorithm with averaging
- Aggregation for Gaussian regression
- Model selection via testing: an alternative to (penalized) maximum likelihood estimators.
- Square root penalty: Adaptation to the margin in classification and in edge estimation
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization
- Theory of Classification: a Survey of Some Recent Advances
- Information Theory and Mixing Least-Squares Regressions
- Sequential Procedures for Aggregating Arbitrary Estimators of a Conditional Mean
- Model Selection: An Integral Part of Inference
- Rademacher penalties and structural risk minimization
- DOI: 10.1162/1532443041424319
- Learning Theory and Kernel Machines
- Prediction, Learning, and Games
- Learning Theory
- Convexity, Classification, and Risk Bounds
- Some applications of concentration inequalities to statistics