A simple extension of boosting for asymmetric mislabeled data
Publication: Q419240
DOI: 10.1016/j.spl.2011.10.014
zbMATH Open: 1237.62081
OpenAlex: W1985400179
MaRDI QID: Q419240
Author: Kenichi Hayashi
Publication date: 18 May 2012
Published in: Statistics \& Probability Letters
Full work available at URL: https://doi.org/10.1016/j.spl.2011.10.014
MSC classification: Classification and discrimination; cluster analysis (statistical aspects) (62H30); Statistical tables (62Q05)
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Greedy function approximation: A gradient boosting machine.
- A note on margin-based loss functions in classification
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Boosting algorithms: regularization, prediction and model fitting
- Robustifying AdaBoost by Adding the Naive Error Rate
- On the Bayes-risk consistency of regularized boosting methods.
- AdaBoost is consistent
Cited in: 3 documents