Simultaneous adaptation to the margin and to complexity in classification
From MaRDI portal (MaRDI item Q2456017)
Abstract: We consider the problem of adaptation to the margin and to complexity in binary classification. We suggest an exponential weighting aggregation scheme and use it to construct classifiers that adapt automatically to both margin and complexity. Two main examples are worked out in which adaptivity is achieved in frameworks proposed by Steinwart and Scovel [Learning Theory. Lecture Notes in Comput. Sci. 3559 (2005) 279--294. Springer, Berlin; Ann. Statist. 35 (2007) 575--607] and Tsybakov [Ann. Statist. 32 (2004) 135--166]. Unlike adaptive schemes such as ERM or penalized ERM, which usually involve a minimization step, our procedure requires no minimization.
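The exponential weighting aggregation idea mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact construction: the function name, the 0-1 empirical risk, and the `temperature` parameter are assumptions chosen for the sketch. Each base classifier receives a weight proportional to exp(-n * empirical risk / temperature), and the aggregate is the sign of the resulting convex combination; note that no minimization step is performed.

```python
import numpy as np

def exponential_weighting(base_preds, y, temperature=1.0):
    """Aggregate base classifiers by exponential weighting (illustrative sketch).

    base_preds : (M, n) array of base-classifier predictions in {-1, +1}
    y          : (n,) array of labels in {-1, +1}
    temperature: positive scalar controlling how sharply weights
                 concentrate on low-risk classifiers (assumed parameter)

    Returns the weight vector and the aggregated {-1, +1} predictions.
    """
    n = y.shape[0]
    # Empirical 0-1 risk of each base classifier on the sample.
    risks = np.mean(np.sign(base_preds) != y, axis=1)
    # Exponential weights: w_j proportional to exp(-n * risk_j / temperature).
    logits = -n * risks / temperature
    logits -= logits.max()          # shift for numerical stability
    w = np.exp(logits)
    w /= w.sum()
    # Convex combination of base predictions, then take the sign.
    agg = w @ base_preds
    return w, np.sign(agg)
```

For example, with one perfect and one always-wrong base classifier, the weights concentrate almost entirely on the perfect one, so the aggregate reproduces its predictions without any explicit minimization over the candidate set.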
Recommendations
- Optimal Oracle Inequality for Aggregation of Classifiers Under Low Noise Condition
- Optimal aggregation of classifiers in statistical learning
- Margin-adaptive model selection in statistical learning
- Optimal rates of aggregation in classification under low noise assumption
- Square root penalty: Adaption to the margin in classification and in edge estimation
Cites work
- Scientific article, zbMATH DE number 1950576 (no title available)
- Scientific article, zbMATH DE number 1522808 (no title available)
- Scientific article, zbMATH DE number 1552503 (no title available)
- Scientific article, zbMATH DE number 893887 (no title available)
- Scientific article, zbMATH DE number 1420699 (no title available)
- DOI: 10.1162/1532443041424319
- A decision-theoretic generalization of on-line learning and an application to boosting
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Aggregation for Gaussian regression
- An introduction to support vector machines and other kernel-based learning methods
- Analyzing bagging
- Asymptotical minimax recovery of sets with smooth boundaries
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization
- Combining different procedures for adaptive regression
- Complexity regularization via localized random penalties
- Concentration inequalities and model selection. Ecole d'Eté de Probabilités de Saint-Flour XXXIII -- 2003
- Convexity, Classification, and Risk Bounds
- Fast learning rates for plug-in classifiers
- Fast rates for support vector machines using Gaussian kernels
- Functional aggregation for nonparametric regression
- Information Theory and Mixing Least-Squares Regressions
- Learning Theory
- Learning Theory and Kernel Machines
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Metric entropy of some classes of sets with differentiable boundaries
- Minimax theory of image reconstruction
- Mixing strategies for density estimation
- Model Selection: An Integral Part of Inference
- Model selection via testing: an alternative to (penalized) maximum likelihood estimators
- On the Bayes-risk consistency of regularized boosting methods
- Optimal aggregation of classifiers in statistical learning
- Prediction, Learning, and Games
- Rademacher penalties and structural risk minimization
- Recursive aggregation of estimators by the mirror descent algorithm with averaging
- Risk bounds for statistical learning
- Sequential Procedures for Aggregating Arbitrary Estimators of a Conditional Mean
- Smooth discrimination analysis
- Some applications of concentration inequalities to statistics
- Square root penalty: Adaption to the margin in classification and in edge estimation
- Statistical behavior and consistency of classification methods based on convex risk minimization
- Statistical learning theory and stochastic optimization. Ecole d'Eté de Probabilités de Saint-Flour XXXI -- 2001
- Statistical performance of support vector machines
- Support-vector networks
- Theory of Classification: a Survey of Some Recent Advances
Cited in (19 documents)
- Optimal Oracle Inequality for Aggregation of Classifiers Under Low Noise Condition
- Statistical performance of support vector machines
- Square root penalty: Adaption to the margin in classification and in edge estimation
- Risk bounds for CART classifiers under a margin condition
- Bandwidth selection in kernel empirical risk minimization via the gradient
- On the optimality of the aggregate with exponential weights for low temperatures
- Set structured global empirical risk minimizers are rate optimal in general dimensions
- Fast learning rates in statistical inference through aggregation
- Margin-adaptive model selection in statistical learning
- Sampling from non-smooth distributions through Langevin diffusion
- Sharp oracle inequalities for low-complexity priors
- Ordered smoothers with exponential weighting
- Classification in general finite dimensional spaces with the \(k\)-nearest neighbor rule
- Kullback-Leibler aggregation and misspecified generalized linear models
- A universal procedure for aggregating estimators
- Sparse estimation by exponential weighting
- PAC-Bayesian risk bounds for group-analysis sparse regression by exponential weighting
- Optimal rates of aggregation in classification under low noise assumption
- Concentration inequalities for the exponential weighting method