Square root penalty: Adaptation to the margin in classification and in edge estimation
DOI: 10.1214/009053604000001066 · zbMATH Open: 1080.62047 · arXiv: math/0507422 · OpenAlex: W3099719085 · MaRDI QID: Q2569239
Authors: Alexandre B. Tsybakov, Sara van de Geer
Publication date: 18 October 2005
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/math/0507422
Mathematics Subject Classification
- Density estimation (62G07)
- Nonparametric regression and quantile regression (62G08)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Pattern recognition, speech recognition (68T10)
Cites Work
- Risk bounds for model selection via penalization
- Smooth discrimination analysis
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Convexity, Classification, and Risk Bounds
- Wavelets, approximation, and statistical applications
- Minimax theory of image reconstruction
- Optimal aggregation of classifiers in statistical learning.
- Rademacher penalties and structural risk minimization
- Adaptive estimation with soft thresholding penalties
- Complexity regularization via localized random penalties
- Aggregated estimators and empirical complexity for least square regression
- DOI: 10.1162/1532443041424319
- Penalized blockwise Stein's method, monotone oracles and sharp adaptive estimation
Cited In (21)
- A systematic review on model selection in high-dimensional regression
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- A multivariate adaptive stochastic search method for dimensionality reduction in classification
- On regression and classification with possibly missing response variables in the data
- Risk bounds for CART classifiers under a margin condition
- Margin-adaptive model selection in statistical learning
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Fast learning rates for plug-in classifiers
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization
- Fast convergence rates of deep neural networks for classification
- Optimal exponential bounds on the accuracy of classification
- Inverse statistical learning
- Minimax fast rates for discriminant analysis with errors in variables
- Classification with minimax fast rates for classes of Bayes rules with sparse representation
- Simultaneous adaptation to the margin and to complexity in classification
- Statistical performance of support vector machines
- Optimal rates for plug-in estimators of density level sets
- Estimation of high-dimensional low-rank matrices
- A survey on Neyman-Pearson classification and suggestions for future research
- Theory of Classification: a Survey of Some Recent Advances
- Relaxed Lasso