Boosting methods for regression
Publication:5959971
DOI: 10.1023/A:1013685603443
zbMATH Open: 0998.68113
OpenAlex: W2159624360
MaRDI QID: Q5959971
FDO: Q5959971
Authors: Nigel Duffy, David P. Helmbold
Publication date: 11 April 2002
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1023/a:1013685603443
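This paper studies boosting algorithms that combine weak regressors to drive down squared error. As illustration only, here is a minimal sketch of generic least-squares ("L2") boosting with regression stumps: each round fits a stump to the current residuals and adds a damped copy to the ensemble. The stump learner, learning rate, and toy data are assumptions for the sketch, not the paper's SquareLev/ExpLev procedures.

```python
# Minimal sketch of least-squares boosting with regression stumps.
# Assumptions: a brute-force single-split stump as the weak learner,
# a fixed shrinkage factor `lr`, and a toy step-function dataset.

def fit_stump(x, r):
    """Best single-threshold stump minimizing squared error on residuals r."""
    best = None
    n = len(x)
    for t in sorted(set(x)):
        left = [r[i] for i in range(n) if x[i] <= t]
        right = [r[i] for i in range(n) if x[i] > t]
        if not left or not right:
            continue  # degenerate split: all points on one side
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((v - lmean) ** 2 for v in left)
               + sum((v - rmean) ** 2 for v in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, a, b = best
    return lambda z, t=t, a=a, b=b: a if z <= t else b

def l2_boost(x, y, rounds=50, lr=0.5):
    """Each round fits a stump to the residuals (the negative gradient
    of squared loss) and adds lr times that stump to the ensemble."""
    stumps = []
    pred = [0.0] * len(x)
    for _ in range(rounds):
        resid = [y[i] - pred[i] for i in range(len(x))]
        h = fit_stump(x, resid)
        stumps.append(h)
        pred = [pred[i] + lr * h(x[i]) for i in range(len(x))]
    return lambda z: sum(lr * h(z) for h in stumps)

# Toy data: y is a step function of x; the ensemble fits it closely.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
f = l2_boost(xs, ys)
mse = sum((f(v) - t) ** 2 for v, t in zip(xs, ys)) / len(xs)
print(mse)
```

Because each round removes a constant fraction of the remaining residual, the training error shrinks geometrically here; the paper's contribution is to characterize when such squared-error reduction by weak regressors yields a strong regressor.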
Cited In (26)
- Improving nonparametric regression methods by bagging and boosting
- Title not available
- A new perspective on boosting in linear regression via subgradient optimization and relatives
- An $L_{2}$-Boosting Algorithm for Estimation of a Regression Function
- Title not available
- Large-Margin Thresholded Ensembles for Ordinal Regression: Theory and Practice
- Robust regression by boosting the median
- Quadratic boosting
- Boosting and instability for regression trees
- Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression
- Boosting with missing predictors
- Title not available
- Boosting of granular models
- An empirical study of using Rotation Forest to improve regressors
- Rejoinder: Boosting algorithms: regularization, prediction and model fitting
- Boosting in structured additive models
- Training regression ensembles by sequential target correction and resampling
- On a method for constructing ensembles of regression models
- On boosting kernel regression
- Using boosting to prune double-bagging ensembles
- Boosting With the $L_{2}$ Loss
- Boosting algorithms: regularization, prediction and model fitting
- Title not available
- Title not available
- Boosting conditional probability estimators
- Sparse regression ensembles in infinite and finite hypothesis spaces
This page was built for publication: Boosting methods for regression