Square root penalty: Adaptation to the margin in classification and in edge estimation
Publication:2569239
Abstract: We consider the problem of adaptation to the margin in binary classification. We suggest a penalized empirical risk minimization classifier that adaptively attains, up to a logarithmic factor, fast optimal rates of convergence for the excess risk, that is, rates that can be faster than n^{-1/2}, where n is the sample size. We show that our method also gives adaptive estimators for the problem of edge estimation.
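The penalized empirical risk minimization principle described in the abstract can be illustrated with a minimal sketch: among a finite set of candidate classifiers, select the one minimizing the empirical 0-1 risk plus a complexity penalty. The threshold classifiers and the penalty values below are purely illustrative assumptions; the paper's actual square-root penalty acts on the coefficients of a dictionary expansion and is not reproduced here.

```python
import numpy as np

def penalized_erm(candidates, penalties, X, y):
    """Select the candidate minimizing empirical 0-1 risk plus a penalty.

    candidates: functions mapping X to labels in {-1, +1}
    penalties:  one complexity penalty per candidate (illustrative values)
    """
    best_f, best_score = None, np.inf
    for f, pen in zip(candidates, penalties):
        risk = np.mean(f(X) != y)   # empirical 0-1 risk on the sample
        score = risk + pen          # penalized selection criterion
        if score < best_score:
            best_f, best_score = f, score
    return best_f, best_score

# Toy usage: noiseless labels given by a threshold at 0.
X = np.linspace(-1.0, 1.0, 100)     # 0 is not a grid point here
y = np.where(X > 0.0, 1, -1)
thresholds = [-0.5, 0.0, 0.5]       # hypothetical candidate thresholds
candidates = [lambda X, t=t: np.where(X > t, 1, -1) for t in thresholds]
penalties = [0.01, 0.02, 0.03]      # hypothetical penalty values
f_hat, score = penalized_erm(candidates, penalties, X, y)
```

In this toy setup the middle candidate classifies every sample point correctly, so it is selected despite carrying a slightly larger penalty than the first candidate.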
Recommendations
Cites work
- scientific article; zbMATH DE number 5654889
- scientific article; zbMATH DE number 1332320
- scientific article; zbMATH DE number 477682
- scientific article; zbMATH DE number 1950576
- scientific article; zbMATH DE number 893887
- 10.1162/1532443041424319
- Adaptive estimation with soft thresholding penalties
- Aggregated estimators and empirical complexity for least square regression
- Complexity regularization via localized random penalties
- Convexity, Classification, and Risk Bounds
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Minimax theory of image reconstruction
- Optimal aggregation of classifiers in statistical learning
- Penalized blockwise Stein's method, monotone oracles and sharp adaptive estimation
- Rademacher penalties and structural risk minimization
- Risk bounds for model selection via penalization
- Smooth discrimination analysis
- Wavelets, approximation, and statistical applications
Cited in (21 documents)
- Classification with minimax fast rates for classes of Bayes rules with sparse representation
- Optimal rates for plug-in estimators of density level sets
- Statistical performance of support vector machines
- A survey on Neyman-Pearson classification and suggestions for future research
- Risk bounds for CART classifiers under a margin condition
- Fast convergence rates of deep neural networks for classification
- Estimation of high-dimensional low-rank matrices
- Margin-adaptive model selection in statistical learning
- A systematic review on model selection in high-dimensional regression
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization
- A multivariate adaptive stochastic search method for dimensionality reduction in classification
- Fast learning rates for plug-in classifiers
- Relaxed Lasso
- Inverse statistical learning
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Simultaneous adaptation to the margin and to complexity in classification
- Minimax fast rates for discriminant analysis with errors in variables
- On regression and classification with possibly missing response variables in the data
- Optimal exponential bounds on the accuracy of classification
- Theory of Classification: a Survey of Some Recent Advances