Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression
Publication: 5476689
DOI: 10.1162/neco.2006.18.7.1678
zbMath: 1115.68488
OpenAlex: W2121043290
Wikidata: Q51940163 (Scholia: Q51940163)
MaRDI QID: Q5476689
Durga L. Shrestha, Dimitri Solomatine
Publication date: 17 July 2006
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco.2006.18.7.1678
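
Since this record gives only metadata, a brief sketch of the algorithm it catalogs may be useful. AdaBoost.RT boosts a weak regressor by thresholding each prediction's absolute relative error at a value phi, treating examples above the threshold as "misclassified," and reweighting the training distribution accordingly. The sketch below is a minimal illustration under stated assumptions (scikit-learn's DecisionTreeRegressor as the weak learner; illustrative values for phi, n_power, and T), not the authors' implementation or experimental setup.

```python
# Minimal sketch of the AdaBoost.RT scheme studied in the paper.
# Assumptions (not from this page): scikit-learn's DecisionTreeRegressor
# as the weak learner, and illustrative values for the relative-error
# threshold phi, the power coefficient n_power, and the rounds T.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def adaboost_rt(X, y, T=20, phi=0.1, n_power=2):
    m = len(y)
    D = np.full(m, 1.0 / m)                 # uniform initial weights
    models, betas = [], []
    for _ in range(T):
        f = DecisionTreeRegressor(max_depth=3)
        f.fit(X, y, sample_weight=D)
        # Absolute relative error; assumes nonzero targets y.
        are = np.abs(f.predict(X) - y) / np.abs(y)
        miss = are > phi                    # predictions deemed "incorrect"
        eps = D[miss].sum()                 # weighted misclassification rate
        if eps <= 0.0 or eps >= 1.0:        # degenerate round: stop early
            break
        beta = eps ** n_power               # beta_t = eps_t ** n
        D[~miss] *= beta                    # shrink weights of "correct" examples
        D /= D.sum()                        # renormalize the distribution
        models.append(f)
        betas.append(beta)
    w = np.log(1.0 / np.array(betas))       # log(1/beta_t) voting weights

    def predict(X_new):
        preds = np.array([f.predict(X_new) for f in models])
        return (w[:, None] * preds).sum(axis=0) / w.sum()

    return predict
```

A distinguishing design choice, per the paper, is that the per-round error rate is computed from thresholded relative errors rather than a continuous loss, and beta_t = eps_t^n lets the weight-update law be tuned (linear, square, or cubic via n).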
Related Items (10):
- Fast decorrelated neural network ensembles with random weights
- PBoostGA: pseudo-boosting genetic algorithm for variable ranking and selection
- A robust AdaBoost.RT based ensemble extreme learning machine
- On a method for constructing ensembles of regression models
- An empirical study of using Rotation Forest to improve regressors
- Using boosting to prune double-bagging ensembles
- Modular learning models in forecasting natural phenomena
- Rotation Forests for regression
- Forecasting financial and macroeconomic variables using data reduction methods: new empirical evidence
- Adaboost-based ensemble of polynomial chaos expansion with adaptive sampling
Cites Work:
- Greedy function approximation: a gradient boosting machine (Friedman, 2001)
- Multivariate adaptive regression splines (Friedman, 1991)
- A decision-theoretic generalization of on-line learning and an application to boosting (Freund and Schapire, 1997)
- Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors) (Friedman, Hastie, and Tibshirani, 2000)
- A theory of the learnable (Valiant, 1984)