Soft-max boosting
Publication: 747255
DOI: 10.1007/s10994-015-5491-2
zbMATH Open: 1341.68155
OpenAlex: W1970620712
MaRDI QID: Q747255
FDO: Q747255
Author: Matthieu Geist
Publication date: 23 October 2015
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-015-5491-2
Mathematics Subject Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
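This entry records only bibliographic metadata. Purely as an illustration of the technique named in the title, and not reconstructed from Geist's paper, the sketch below shows generic functional gradient boosting under a soft-max (multinomial cross-entropy) link: per-class scores are built additively by regression trees fit to the pseudo-residuals of the soft-max loss. All function names, parameters, and design choices here are assumptions.

```python
# Hypothetical sketch only: gradient boosting on a soft-max
# (multinomial cross-entropy) loss. NOT the algorithm from the
# cited paper; a generic illustration of boosting with a soft-max link.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def softmax(F):
    # Row-wise soft-max with the usual max-shift for numerical stability.
    Z = F - F.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def softmax_boost(X, y, n_classes, n_rounds=50, lr=0.1, depth=2):
    n = X.shape[0]
    Y = np.eye(n_classes)[y]           # one-hot targets
    F = np.zeros((n, n_classes))       # additive scores, one column per class
    ensemble = []
    for _ in range(n_rounds):
        P = softmax(F)
        residual = Y - P               # negative functional gradient of the cross-entropy
        round_trees = []
        for k in range(n_classes):
            tree = DecisionTreeRegressor(max_depth=depth)
            tree.fit(X, residual[:, k])  # weak learner fits the pseudo-residuals
            F[:, k] += lr * tree.predict(X)
            round_trees.append(tree)
        ensemble.append(round_trees)
    return ensemble

def predict(ensemble, X, n_classes, lr=0.1):
    F = np.zeros((X.shape[0], n_classes))
    for round_trees in ensemble:
        for k, tree in enumerate(round_trees):
            F[:, k] += lr * tree.predict(X)
    return F.argmax(axis=1)

if __name__ == "__main__":
    # Tiny synthetic check: two linearly separated classes.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    model = softmax_boost(X, y, n_classes=2)
    print((predict(model, X, n_classes=2) == y).mean())
```

A smooth soft-max surrogate of this kind is differentiable everywhere, which is what makes the functional-gradient step well defined at every round.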
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Title not available
- Title not available
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Multicategory Support Vector Machines
- An adaptive version of the boost by majority algorithm
- Multi-class AdaBoost
- Stochastic gradient boosting
- Improved boosting algorithms using confidence-rated predictions
- Convexity, Classification, and Risk Bounds
- Quasi-orthogonality with applications to some families of classical orthogonal polynomials
- Boosting a weak learning algorithm by majority
- Boosting. Foundations and algorithms
- Random classification noise defeats all convex potential boosters
- Statistical analysis of some multi-category large margin classification methods
- How to compare different loss functions and their risks
- Learning Theory
- Boosting in the presence of noise