Simultaneous adaptation to the margin and to complexity in classification
DOI: 10.1214/009053607000000055
zbMATH Open: 1209.62146
arXiv: math/0509696
OpenAlex: W1981292117
MaRDI QID: Q2456017
FDO: Q2456017
Publication date: 17 October 2007
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/math/0509696
Recommendations
- Optimal Oracle Inequality for Aggregation of Classifiers Under Low Noise Condition
- Optimal aggregation of classifiers in statistical learning.
- Margin-adaptive model selection in statistical learning
- Optimal rates of aggregation in classification under low noise assumption
- Square root penalty: Adaptation to the margin in classification and in edge estimation
Keywords: aggregation; classification; statistical learning; SVM; margin; fast rates of convergence; excess risk; complexity of classes of sets
MSC: Classification and discrimination; cluster analysis (statistical aspects) (62H30); Pattern recognition, speech recognition (68T10)
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Asymptotical minimax recovery of sets with smooth boundaries
- Support-vector networks
- Learning Theory and Kernel Machines
- Prediction, Learning, and Games
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- An introduction to support vector machines and other kernel-based learning methods.
- Analyzing bagging
- Model Selection: An Integral Part of Inference
- Combining different procedures for adaptive regression
- Smooth discrimination analysis
- Mixing strategies for density estimation.
- Functional aggregation for nonparametric regression.
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Aggregation for Gaussian regression
- Model selection via testing: an alternative to (penalized) maximum likelihood estimators.
- Information Theory and Mixing Least-Squares Regressions
- Some applications of concentration inequalities to statistics
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- Convexity, Classification, and Risk Bounds
- Minimax theory of image reconstruction
- Statistical learning theory and stochastic optimization. École d'Été de Probabilités de Saint-Flour XXXI -- 2001.
- Optimal aggregation of classifiers in statistical learning.
- Rademacher penalties and structural risk minimization
- Fast learning rates for plug-in classifiers
- Theory of Classification: a Survey of Some Recent Advances
- Learning Theory
- Fast rates for support vector machines using Gaussian kernels
- Statistical performance of support vector machines
- Risk bounds for statistical learning
- Square root penalty: Adaptation to the margin in classification and in edge estimation
- On the Bayes-risk consistency of regularized boosting methods.
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization
- Sequential Procedures for Aggregating Arbitrary Estimators of a Conditional Mean
- Complexity regularization via localized random penalties
- Recursive aggregation of estimators by the mirror descent algorithm with averaging
- DOI: 10.1162/1532443041424319
- Metric entropy of some classes of sets with differentiable boundaries
Cited In (18)
- A universal procedure for aggregating estimators
- Fast learning rates in statistical inference through aggregation
- Square root penalty: Adaptation to the margin in classification and in edge estimation
- Risk bounds for CART classifiers under a margin condition
- Margin-adaptive model selection in statistical learning
- Sharp oracle inequalities for low-complexity priors
- Concentration inequalities for the exponential weighting method
- Set structured global empirical risk minimizers are rate optimal in general dimensions
- On the optimality of the aggregate with exponential weights for low temperatures
- Kullback-Leibler aggregation and misspecified generalized linear models
- Bandwidth selection in kernel empirical risk minimization via the gradient
- Sampling from non-smooth distributions through Langevin diffusion
- Statistical performance of support vector machines
- Ordered smoothers with exponential weighting
- Classification in general finite dimensional spaces with the \(k\)-nearest neighbor rule
- PAC-Bayesian risk bounds for group-analysis sparse regression by exponential weighting
- Optimal rates of aggregation in classification under low noise assumption
- Sparse estimation by exponential weighting