Soft-max boosting
From MaRDI portal
Publication:747255
DOI: 10.1007/S10994-015-5491-2
zbMATH Open: 1341.68155
OpenAlex: W1970620712
MaRDI QID: Q747255
FDO: Q747255
Authors: Matthieu Geist
Publication date: 23 October 2015
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-015-5491-2
Mathematics Subject Classification
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Title not available
- A decision-theoretic generalization of on-line learning and an application to boosting
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- An adaptive version of the boost by majority algorithm
- Boosting a weak learning algorithm by majority
- Boosting in the presence of noise
- Boosting. Foundations and algorithms.
- Convexity, Classification, and Risk Bounds
- How to compare different loss functions and their risks
- Improved boosting algorithms using confidence-rated predictions
- Learning Theory
- Multi-class AdaBoost
- Multicategory Support Vector Machines
- Quasi-orthogonality with applications to some families of classical orthogonal polynomials.
- Random classification noise defeats all convex potential boosters
- Scikit-learn: machine learning in Python
- Statistical analysis of some multi-category large margin classification methods
- Stochastic gradient boosting.
This page was built for publication: Soft-max boosting