Robust Loss Functions for Boosting
DOI: 10.1162/NECO.2007.19.8.2183 · zbMATH Open: 1143.68544 · DBLP: journals/neco/KanamoriTEM07 · OpenAlex: W2051570664 · Wikidata: Q51912448 · Scholia: Q51912448 · MaRDI QID: Q5440967 · FDO: Q5440967
Authors: Takafumi Kanamori, Takashi Takenouchi, Shinto Eguchi, Noboru Murata
Publication date: 5 February 2008
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco.2007.19.8.2183
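As context for the paper's topic: boosting algorithms such as AdaBoost minimize an exponential loss of the classification margin, which is unbounded for large negative margins and therefore sensitive to outliers and mislabeled points. The sketch below contrasts the exponential loss with a generic truncated variant; it is an illustration of the general idea of a robust loss, not the authors' specific proposal, and the `cap` parameter is hypothetical.

```python
import math

def exponential_loss(margin):
    # AdaBoost's loss exp(-margin): grows without bound as the margin
    # becomes more negative, so a single mislabeled point can dominate.
    return math.exp(-margin)

def truncated_exponential_loss(margin, cap=5.0):
    # A generic robustified variant (illustrative only): clip the loss at
    # `cap` so grossly mislabeled examples contribute a bounded penalty.
    return min(math.exp(-margin), cap)

# A badly mislabeled example corresponds to a large negative margin:
outlier_margin = -10.0
print(exponential_loss(outlier_margin))            # ~22026.5: dominates the objective
print(truncated_exponential_loss(outlier_margin))  # 5.0: bounded influence
```

For well-classified points (positive margins) the two losses coincide; the truncation only changes the treatment of points the current ensemble gets badly wrong, which is where robustness to mislabeling matters.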
Recommendations
- Robust boosting with truncated loss functions
- Boosting in the presence of outliers: adaptive classification with nonconvex loss functions
- Robust boosting for regression problems
- On a robust gradient boosting scheme based on aggregation functions insensitive to outliers
- A robust boosting method for mislabeled data
MSC classification:
- Learning and adaptive systems in artificial intelligence (68T05)
- Problem solving in the context of artificial intelligence (heuristics, search strategies, etc.) (68T20)
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Soft margins for AdaBoost
- Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
- Information Geometry of U-Boost and Bregman Divergence
- Robustifying AdaBoost by Adding the Naive Error Rate
- Linear programming boosting via column generation
- Robust inference with binary data
- DOI: 10.1162/153244304773936072
- Sparse regression ensembles in infinite and finite hypothesis spaces
Cited In (16)
- Robust ELM model with truncated 1-norm loss function
- Boosting in the presence of outliers: adaptive classification with nonconvex loss functions
- Robustifying AdaBoost by Adding the Naive Error Rate
- Online boosting algorithms based on exponential and 0-1 loss
- Robust estimation in regression and classification methods for large dimensional data
- Entropy and divergence associated with power function and the statistical application
- Learned-loss boosting
- Robust boosting with truncated loss functions
- Loss robustness via Fisher-weighted squared-error loss function
- A boosting method with asymmetric mislabeling probabilities which depend on covariates
- An extension of the receiver operating characteristic curve and AUC-optimal classification
- Deformation of log-likelihood loss function for multiclass boosting
- On a robust gradient boosting scheme based on aggregation functions insensitive to outliers
- Robust boosting for regression problems
- A correction on TangentBoost algorithm
- A Tunable Loss Function for Robust Classification: Calibration, Landscape, and Generalization