Square root penalty: Adaptation to the margin in classification and in edge estimation
Publication: 2569239
DOI: 10.1214/009053604000001066
zbMath: 1080.62047
arXiv: math/0507422
OpenAlex: W3099719085
MaRDI QID: Q2569239
Sara van de Geer, Alexandre B. Tsybakov
Publication date: 18 October 2005
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/math/0507422
Mathematics Subject Classification:
- Nonparametric regression and quantile regression (62G08)
- Density estimation (62G07)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Pattern recognition, speech recognition (68T10)
Related Items (19)
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization
- Inverse statistical learning
- Optimal rates for plug-in estimators of density level sets
- Fast convergence rates of deep neural networks for classification
- Statistical performance of support vector machines
- Risk bounds for CART classifiers under a margin condition
- Classification with minimax fast rates for classes of Bayes rules with sparse representation
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- Margin-adaptive model selection in statistical learning
- A systematic review on model selection in high-dimensional regression
- Simultaneous adaptation to the margin and to complexity in classification
- Optimal exponential bounds on the accuracy of classification
- Estimation of high-dimensional low-rank matrices
- A multivariate adaptive stochastic search method for dimensionality reduction in classification
- Fast learning rates for plug-in classifiers
- Theory of Classification: a Survey of Some Recent Advances
- Relaxed Lasso
- Minimax fast rates for discriminant analysis with errors in variables
Cites Work
- Minimax theory of image reconstruction
- Risk bounds for model selection via penalization
- Wavelets, approximation, and statistical applications
- Smooth discrimination analysis
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Penalized blockwise Stein's method, monotone oracles and sharp adaptive estimation
- Complexity regularization via localized random penalties
- Optimal aggregation of classifiers in statistical learning.
- Aggregated estimators and empirical complexity for least square regression
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Adaptive estimation with soft thresholding penalties
- Rademacher penalties and structural risk minimization
- DOI: 10.1162/1532443041424319
- Convexity, Classification, and Risk Bounds