Rejoinder: Boosting algorithms: regularization, prediction and model fitting
From MaRDI portal
Abstract: Rejoinder to ``Boosting Algorithms: Regularization, Prediction and Model Fitting'' [arXiv:0804.2752].
Recommendations
Cites work
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Conditionally Unbiased Bounded-Influence Estimation in General Regression Models, with Applications to Generalized Linear Models
- Greedy function approximation: A gradient boosting machine.
- Least angle regression. (With discussion)
- Random forests
- Stochastic gradient boosting.
Cited in (13)
- Prediction-based variable selection for component-wise gradient boosting
- Boosting algorithms: regularization, prediction and model fitting
- Boosting with missing predictors
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- An $L_{2}$-Boosting Algorithm for Estimation of a Regression Function
- On the practice of rescaling covariates
- An update on statistical boosting in biomedicine
- Interpreting initial offset boosting via reconstitution in integral domain
- Boosting with the $L_{2}$ loss: regression and classification
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- Model-based boosting 2.0
- Subject-specific Bradley–Terry–Luce models with implicit variable selection
- Detection of differential item functioning in Rasch models by boosting techniques
MaRDI item Q449785