Robust Loss Functions for Boosting
Publication:5440967
DOI: 10.1162/neco.2007.19.8.2183
zbMath: 1143.68544
OpenAlex: W2051570664
Wikidata: Q51912448
Scholia: Q51912448
MaRDI QID: Q5440967
Shinto Eguchi, Noboru Murata, Takafumi Kanamori, Takashi Takenouchi
Publication date: 5 February 2008
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco.2007.19.8.2183
MSC classifications:
- 68T05: Learning and adaptive systems in artificial intelligence
- 68T20: Problem solving in the context of artificial intelligence (heuristics, search strategies, etc.)
Related Items
- Robust estimation in regression and classification methods for large dimensional data
- On a robust gradient boosting scheme based on aggregation functions insensitive to outliers
- Entropy and divergence associated with power function and the statistical application
- Deformation of log-likelihood loss function for multiclass boosting
- A boosting method with asymmetric mislabeling probabilities which depend on covariates
- Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions
- An Extension of the Receiver Operating Characteristic Curve and AUC-Optimal Classification
Cites Work
- Robust inference with binary data
- A decision-theoretic generalization of on-line learning and an application to boosting
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
- Robustifying AdaBoost by Adding the Naive Error Rate
- DOI: 10.1162/153244304773936072
- Information Geometry of U-Boost and Bregman Divergence
- Soft margins for AdaBoost
- Sparse regression ensembles in infinite and finite hypothesis spaces
- Linear programming boosting via column generation