Using iterated bagging to debias regressions
DOI: 10.1023/A:1017934522171
zbMath: 1052.68109
MaRDI QID: Q5959948
Publication date: 11 April 2002
Published in: Machine Learning
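The publication's topic, iterated bagging for debiasing regression estimates, can be summarized as: each stage fits a bagged ensemble, out-of-bag predictions are computed for the training cases, and the next stage is trained on the resulting residuals; the final prediction is the sum of the stage predictions. The code below is a minimal illustrative sketch under those assumptions, not the paper's reference implementation; the base learner (`DecisionTreeRegressor`), the fixed number of stages, and the ensemble size are placeholder choices, and the paper's data-driven stopping rule is replaced by a fixed stage count.

```python
# Sketch of iterated bagging for regression debiasing (assumed reconstruction):
# stage 1 fits a bagged ensemble to y; each later stage fits a bagged ensemble
# to the residuals of the previous stage's out-of-bag predictions.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def iterated_bagging(X, y, n_stages=3, n_estimators=50, random_state=0):
    X = np.asarray(X)
    targets = np.asarray(y, dtype=float).copy()   # stage-1 targets: raw responses
    rng = np.random.default_rng(random_state)
    n = len(targets)
    stages = []                                   # list of lists of fitted trees
    for _ in range(n_stages):
        trees, oob_sum, oob_count = [], np.zeros(n), np.zeros(n)
        for _ in range(n_estimators):
            idx = rng.integers(0, n, size=n)      # bootstrap sample
            tree = DecisionTreeRegressor().fit(X[idx], targets[idx])
            trees.append(tree)
            oob = np.setdiff1d(np.arange(n), idx) # out-of-bag cases for this tree
            if oob.size:
                oob_sum[oob] += tree.predict(X[oob])
                oob_count[oob] += 1
        stages.append(trees)
        oob_pred = oob_sum / np.maximum(oob_count, 1)  # averaged OOB prediction
        targets = targets - oob_pred              # next stage fits these residuals
    return stages

def predict(stages, X):
    # Final prediction: sum of the stage-wise bagged (averaged) predictions.
    X = np.asarray(X)
    return sum(np.mean([t.predict(X) for t in trees], axis=0) for trees in stages)
```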
Related Items (25)
COVID-19 pandemic forecasting using CNN-LSTM: a hybrid approach
ESG score prediction through random forest algorithm
Properties of Bagged Nearest Neighbour Classifiers
On bagging and nonlinear estimation
Two-level quantile regression forests for bias correction in range prediction
Committee polyhedral separability: complexity and polynomial approximation
Pricing Bermudan Options Using Regression Trees/Random Forests
Derandomizing Knockoffs
Bias-corrected random forests in regression
AN EFFECTIVE BIAS-CORRECTED BAGGING METHOD FOR THE VALUATION OF LARGE VARIABLE ANNUITY PORTFOLIOS
Cross-validated bagged learning
A Data-Driven Random Subfeature Ensemble Learning Algorithm for Weather Forecasting
Boosting and instability for regression trees
Using boosting to prune double-bagging ensembles
Remembering Leo Breiman
Determining cutoff point of ensemble trees based on sample size in predicting clinical dose with DNA microarray data
Rotation Forests for regression
Stochastic gradient boosting.
Looking for lumps: boosting and bagging for density estimation.
Improving nonparametric regression methods by bagging and boosting.
Delta Boosting Machine with Application to General Insurance
A long short-term memory ensemble approach for improving the outcome prediction in intensive care unit
Optimal weighted nearest neighbour classifiers
Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
On weak base hypotheses and their implications for boosting regression and classification