A geometric approach to leveraging weak learners
Publication: 1603593
DOI: 10.1016/S0304-3975(01)00083-4 · zbMath: 0997.68166 · MaRDI QID: Q1603593
Nigel Duffy, David P. Helmbold
Publication date: 15 July 2002
Published in: Theoretical Computer Science
Related Items (6)
- Bounding the generalization error of convex combinations of classifiers: Balancing the dimensionality and the margins
- Random classification noise defeats all convex potential boosters
- The synergy between PAV and AdaBoost
- Analysis of boosting algorithms using the smooth margin function
- Greedy function approximation: A gradient boosting machine
- Improving nonparametric regression methods by bagging and boosting
Uses Software
Cites Work
- Bagging predictors
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Equivalence of models for polynomial learnability
- A decision-theoretic generalization of on-line learning and an application to boosting
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Boosting a weak learning algorithm by majority
- Improved boosting algorithms using confidence-rated predictions
- An approach to nonlinear programming
- A theory of the learnable
- Learning Boolean formulas
- Prediction Games and Arcing Algorithms
- Soft margins for AdaBoost
- General convergence results for linear discriminant updates
- Stochastic gradient boosting