Robustifying AdaBoost by Adding the Naive Error Rate
Publication: 4819815
DOI: 10.1162/089976604322860695
zbMath: 1097.68608
OpenAlex: W2105231256
Wikidata: Q52001342 (Scholia: Q52001342)
MaRDI QID: Q4819815
Takashi Takenouchi, Shinto Eguchi
Publication date: 5 October 2004
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/089976604322860695
Related Items (16)
- Binary classification with a pseudo exponential model and its application for multi-task learning
- Duality of maximum entropy and minimum divergence
- Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm
- Information Geometry of U-Boost and Bregman Divergence
- A modified EM algorithm for mixture models based on Bregman divergence
- A Multiclass Classification Method Based on Decoding of Binary Classifiers
- Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
- A simple extension of boosting for asymmetric mislabeled data
- Normalized estimating equation for robust parameter estimation
- A boosting method for maximization of the area under the ROC curve
- Entropy and divergence associated with power function and the statistical application
- An Estimation of Generalized Bradley-Terry Models Based on the em Algorithm
- Robust Loss Functions for Boosting
- Deformation of log-likelihood loss function for multiclass boosting
- A boosting method with asymmetric mislabeling probabilities which depend on covariates
- Robust mislabel logistic regression without modeling mislabel probabilities
Cites Work
- Sanov property, generalized I-projection and a conditional limit theorem
- A decision-theoretic generalization of on-line learning and an application to boosting
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- A class of logistic-type discriminant functions
- Soft margins for AdaBoost