Soft-max boosting
DOI: 10.1007/s10994-015-5491-2
zbMath: 1341.68155
OpenAlex: W1970620712
MaRDI QID: Q747255
Publication date: 23 October 2015
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-015-5491-2
Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
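For context on the title's terminology, the sketch below shows the standard soft-max link that multi-class boosting methods of this kind use to turn additive ensemble scores into class probabilities. It is a generic illustration only, not the specific algorithm of the paper; the function name and score values are hypothetical.

```python
import numpy as np

def softmax(scores):
    """Standard soft-max: map real-valued class scores to a
    probability distribution (shifted for numerical stability)."""
    z = scores - scores.max()
    e = np.exp(z)
    return e / e.sum()

# Hypothetical additive (boosted) scores F(x) for three classes,
# e.g. F(x) = sum_t alpha_t * h_t(x) computed per class.
F_x = np.array([1.2, -0.3, 0.7])
print(softmax(F_x))  # approx. [0.547, 0.122, 0.331]
```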
Cites Work
- Multi-class AdaBoost
- A decision-theoretic generalization of on-line learning and an application to boosting
- Quasi-orthogonality with applications to some families of classical orthogonal polynomials
- Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors)
- Boosting a weak learning algorithm by majority
- Random classification noise defeats all convex potential boosters
- Improved boosting algorithms using confidence-rated predictions
- How to compare different loss functions and their risks
- Learning Theory
- Multicategory Support Vector Machines
- Convexity, Classification, and Risk Bounds
- Boosting in the presence of noise
- An adaptive version of the boost by majority algorithm
- Stochastic gradient boosting