Random classification noise defeats all convex potential boosters
From MaRDI portal
DOI: 10.1007/s10994-009-5165-z
zbMath: 1470.68139
Wikidata: Q56114422 (Scholia: Q56114422)
MaRDI QID: Q1959553
Authors: Philip M. Long, Rocco A. Servedio
Publication date: 7 October 2010
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-009-5165-z
Keywords: learning theory; boosting; convex loss; noise-tolerant learning; misclassification noise; potential boosting
62H30: Classification and discrimination; cluster analysis (statistical aspects)
68T05: Learning and adaptive systems in artificial intelligence
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- A geometric approach to leveraging weak learners
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Population theory for boosting ensembles
- On the Bayes-risk consistency of regularized boosting methods
- Statistical behavior and consistency of classification methods based on convex risk minimization
- Boosting a weak learning algorithm by majority
- Improved boosting algorithms using confidence-rated predictions
- Boosting with early stopping: convergence and consistency
- doi:10.1162/153244304773936072
- doi:10.1162/153244304773936108
- Learning Theory
- Boosting in the presence of noise
- Soft margins for AdaBoost
- An adaptive version of the boost by majority algorithm