Cost-sensitive boosting algorithms: do we really need them?
From MaRDI portal
Cites work
- scientific article; zbMATH DE number 193111 (no title available)
- A decision-theoretic generalization of on-line learning and an application to boosting
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Boosting as a regularized path to a maximum margin classifier
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Cost-sensitive boosting for classification of imbalanced data
- Explaining AdaBoost
- Improved boosting algorithms using confidence-rated predictions
- Machine learning. The art and science of algorithms that make sense of data.
- Paradoxes of fuzzy logic, revisited
- Statistical comparisons of classifiers over multiple data sets
- Training Products of Experts by Minimizing Contrastive Divergence
Cited in (8)
- Adaptive stochastic gradient boosting tree with composite criterion
- Cost-sensitive ensemble learning: a unifying framework
- Predicting mortgage early delinquency with machine learning methods
- Cost-sensitive thresholding over a two-dimensional decision region for fraud detection
- Boosted classification trees and class probability/quantile estimation
- Cost-sensitive boosting for classification of imbalanced data
- Beyond sigmoids: how to obtain well-calibrated probabilities from binary classifiers with beta calibration
- Cost-sensitive learning and decision making revisited
MaRDI item: Q331693