Random classification noise defeats all convex potential boosters
Publication: 1959553
DOI: 10.1007/s10994-009-5165-z
zbMath: 1470.68139
Wikidata: Q56114422 (Scholia: Q56114422)
MaRDI QID: Q1959553
Philip M. Long, Rocco A. Servedio
Publication date: 7 October 2010
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-009-5165-z
Keywords: learning theory; boosting; convex loss; noise-tolerant learning; misclassification noise; potential boosting
62H30: Classification and discrimination; cluster analysis (statistical aspects)
68T05: Learning and adaptive systems in artificial intelligence
Related Items
- Surprising properties of dropout in deep networks
- Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions
- A Framework of Learning Through Empirical Gain Maximization
- Robust Support Vector Machines for Classification with Nonconvex and Smooth Losses
- Classification with asymmetric label noise: consistency and maximal denoising
- The risk of trivial solutions in bipartite top ranking
- Soft-max boosting
- Binary classification with corrupted labels
- On the noise estimation statistics
- Robustness of learning algorithms using hinge loss with outlier indicators
- A non-intrusive correction algorithm for classification problems with corrupted data
- Robust Algorithms via PAC-Bayes and Laplace Distributions
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- A geometric approach to leveraging weak learners
- Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors)
- Population theory for boosting ensembles
- On the Bayes-risk consistency of regularized boosting methods
- Statistical behavior and consistency of classification methods based on convex risk minimization
- Boosting a weak learning algorithm by majority
- Improved boosting algorithms using confidence-rated predictions
- Boosting with early stopping: convergence and consistency
- 10.1162/153244304773936072
- 10.1162/153244304773936108
- Learning Theory
- Boosting in the presence of noise
- Soft margins for AdaBoost
- An adaptive version of the boost by majority algorithm