Population theory for boosting ensembles.
Publication: 1884600
DOI: 10.1214/aos/1079120126
zbMath: 1105.62308
OpenAlex: W2035446310
MaRDI QID: Q1884600
Publication date: 5 November 2004
Published in: The Annals of Statistics
Full work available at URL: https://projecteuclid.org/euclid.aos/1079120126
MSC classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Bayesian inference (62F15)
Related Items
- InfoGram and admissible machine learning
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- Infinitesimal gradient boosting
- Accelerated gradient boosting
- Ranking and empirical minimization of \(U\)-statistics
- Parallel hierarchical sampling: a general-purpose interacting Markov chains Monte Carlo algorithm
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- Random classification noise defeats all convex potential boosters
- A comparison of classification models to identify the Fragile X Syndrome
- On the consistency of multi-label learning
- Surprising properties of dropout in deep networks
- Remembering Leo Breiman
- Selection of Binary Variables and Classification by Boosting
- Theory of Classification: a Survey of Some Recent Advances
- Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions
- Boosted nonparametric hazards with time-dependent covariates
- A stochastic approximation view of boosting
- AdaBoost and robust one-bit compressed sensing
- Boosting with early stopping: convergence and consistency
- A new accelerated proximal boosting machine with convergence rate \(O(1/t^2)\)
- Random forest estimation of conditional distribution functions and conditional quantiles
- Optimization by Gradient Boosting
Cites Work
- Bagging predictors
- Arcing classifiers. (With discussion)
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Process consistency for AdaBoost.
- On the Bayes-risk consistency of regularized boosting methods.
- Improved boosting algorithms using confidence-rated predictions
- Variance reduction trends on `boosted' classifiers
- Boosting with early stopping: convergence and consistency
- Boosting with the \(L_2\) loss