Optimal rates of aggregation in classification under low noise assumption
From MaRDI portal
Publication: 2469663
DOI: 10.3150/07-BEJ6044 · zbMath: 1129.62060 · arXiv: math/0603447 · MaRDI QID: Q2469663
Publication date: 6 February 2008
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/math/0603447
Related Items (9)
- Fast learning rates in statistical inference through aggregation
- An adaptive multiclass nearest neighbor classifier
- Oracle inequalities for cross-validation type procedures
- Classification with minimax fast rates for classes of Bayes rules with sparse representation
- General oracle inequalities for model selection
- Mirror averaging with sparsity priors
- Aggregation of affine estimators
- Optimal learning with \textit{Q}-aggregation
- Unnamed Item
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Risk bounds for statistical learning
- Learning by mirror averaging
- Fast rates for support vector machines using Gaussian kernels
- Fast learning rates for plug-in classifiers
- A decision-theoretic generalization of on-line learning and an application to boosting
- Smooth discrimination analysis
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Mixing strategies for density estimation.
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Functional aggregation for nonparametric regression.
- Analyzing bagging
- Statistical learning theory and stochastic optimization. École d'Été de Probabilités de Saint-Flour XXXI -- 2001.
- On the Bayes-risk consistency of regularized boosting methods.
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Optimal aggregation of classifiers in statistical learning.
- Support-vector networks
- Statistical performance of support vector machines
- Aggregation for Gaussian regression
- Simultaneous adaptation to the margin and to complexity in classification
- Model selection via testing: an alternative to (penalized) maximum likelihood estimators.
- Theory of Classification: a Survey of Some Recent Advances
- Minimax nonparametric classification. II. Model selection for adaptation
- Minimax nonparametric classification. I. Rates of convergence
- DOI: 10.1162/1532443041424319
- Classification with reject option
- Learning Theory and Kernel Machines
- Optimal Oracle Inequality for Aggregation of Classifiers Under Low Noise Condition
- Suboptimality of Penalized Empirical Risk Minimization in Classification
- Learning Theory
- Convexity, Classification, and Risk Bounds
- Some applications of concentration inequalities to statistics