Margin-adaptive model selection in statistical learning
Publication: 453298
DOI: 10.3150/10-BEJ288
zbMath: 1345.62087
arXiv: 0804.2937
MaRDI QID: Q453298
Sylvain Arlot, Peter L. Bartlett
Publication date: 19 September 2012
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/0804.2937
model selection, adaptivity, oracle inequalities, statistical learning, empirical risk minimization, empirical minimization, local Rademacher complexity, margin condition
Ridge regression; shrinkage estimators (Lasso) (62J07)
Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Related Items
- Risk bounds for CART classifiers under a margin condition
- Optimal linear discriminators for the discrete choice model in growing dimensions
Cites Work
- Risk bounds for statistical learning
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- Fast learning rates for plug-in classifiers
- Minimum contrast estimators on sieves: Exponential bounds and rates of convergence
- Risk bounds for model selection via penalization
- Smooth discrimination analysis
- Complexity regularization via localized random penalties
- Optimal aggregation of classifiers in statistical learning.
- Model selection by resampling penalization
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Simultaneous adaptation to the margin and to complexity in classification
- Square root penalty: Adaptation to the margin in classification and in edge estimation
- Local Rademacher complexities
- Estimating the Error Rate of a Prediction Rule: Improvement on Cross-Validation
- Learning Theory
- DOI: 10.1162/1532443041424319
- Suboptimality of Penalized Empirical Risk Minimization in Classification
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Convexity, Classification, and Risk Bounds