Boosting and instability for regression trees
From MaRDI portal
Publication: 959181
DOI: 10.1016/j.csda.2004.09.001 · zbMath: 1431.62155 · OpenAlex: W2039425303 · MaRDI QID: Q959181
Jean-Michel Poggi, Servane Gey
Publication date: 11 December 2008
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2004.09.001
Mathematics Subject Classification:
- Computational methods for problems pertaining to statistics (62-08)
- Nonparametric regression and quantile regression (62G08)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items
- Prior Distributions for Item Parameters in IRT Models
- Boosting and instability for regression trees
- CART algorithm for spatial data: application to environmental and ecological data
- Taxonomy for characterizing ensemble methods in classification tasks: a review and annotated bibliography
- Editorial: Machine learning and robust data mining
- A stochastic approximation view of boosting
- Trimmed bagging
- Influence measures for CART classification trees
- Accurate tree-based missing data imputation and data fusion within the statistical learning paradigm
Cites Work
- Greedy function approximation: a gradient boosting machine
- Bagging predictors
- Boosting and instability for regression trees
- Multivariate adaptive regression splines
- Heuristics of instability and stabilization in model selection
- A decision-theoretic generalization of on-line learning and an application to boosting
- Partial and recombined estimators for nonlinear additive models
- Arcing classifiers (with discussion)
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors)
- Analyzing bagging
- On the Bayes-risk consistency of regularized boosting methods
- doi:10.1162/153244302760200704
- doi:10.1162/1532443041424319
- Random forests
- Improving nonparametric regression methods by bagging and boosting
- Using iterated bagging to debias regressions