Soft-max boosting
From MaRDI portal
Publication:747255
Cites work
- scientific article; zbMATH DE number 3860199
- A decision-theoretic generalization of on-line learning and an application to boosting
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- An adaptive version of the boost by majority algorithm
- Boosting a weak learning algorithm by majority
- Boosting in the presence of noise
- Boosting. Foundations and algorithms
- Convexity, Classification, and Risk Bounds
- How to compare different loss functions and their risks
- Improved boosting algorithms using confidence-rated predictions
- Learning Theory
- Multi-class AdaBoost
- Multicategory Support Vector Machines
- Quasi-orthogonality with applications to some families of classical orthogonal polynomials
- Random classification noise defeats all convex potential boosters
- Scikit-learn: machine learning in Python
- Statistical analysis of some multi-category large margin classification methods
- Stochastic gradient boosting