Robustifying AdaBoost by Adding the Naive Error Rate
Cites work
- A class of logistic-type discriminant functions
- A decision-theoretic generalization of on-line learning and an application to boosting
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Sanov property, generalized I-projection and a conditional limit theorem
- Soft margins for AdaBoost
Cited in (17)
- Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
- Information Geometry of U-Boost and Bregman Divergence
- A simple extension of boosting for asymmetric mislabeled data
- Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm
- Normalized estimating equation for robust parameter estimation
- Binary classification with a pseudo exponential model and its application for multi-task learning
- Duality of maximum entropy and minimum divergence
- Scientific article (zbMATH DE number 1931839; no title available)
- Entropy and divergence associated with power function and the statistical application
- A modified EM algorithm for mixture models based on Bregman divergence
- A boosting method with asymmetric mislabeling probabilities which depend on covariates
- Deformation of log-likelihood loss function for multiclass boosting
- Robust mislabel logistic regression without modeling mislabel probabilities
- Robust Loss Functions for Boosting
- A Multiclass Classification Method Based on Decoding of Binary Classifiers
- An estimation of generalized Bradley-Terry models based on the EM algorithm
- A boosting method for maximization of the area under the ROC curve
MaRDI item Q4819815