Using iterated bagging to debias regressions

Publication:5959948

DOI: 10.1023/A:1017934522171
zbMath: 1052.68109
MaRDI QID: Q5959948

Author: Leo Breiman

Publication date: 11 April 2002

Published in: Machine Learning
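For context, the paper's title refers to Breiman's iterated bagging procedure for reducing the bias of bagged regressors: fit a bagged regressor, replace the training targets with out-of-bag residuals, fit another bagged stage to those residuals, and sum the stage predictions. The sketch below is an illustrative reconstruction of that idea only, not the paper's exact algorithm; the base learner (scikit-learn's DecisionTreeRegressor), the number of bootstrap replicates, and the fixed stage count are assumptions made for this example.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def bagged_stage(X, y, n_bags, rng):
    """Fit one bagged stage and return its trees plus out-of-bag predictions."""
    n = len(y)
    trees, oob_sum, oob_cnt = [], np.zeros(n), np.zeros(n)
    for _ in range(n_bags):
        idx = rng.integers(0, n, size=n)          # bootstrap sample (with replacement)
        oob = np.setdiff1d(np.arange(n), idx)     # points the sample left out
        tree = DecisionTreeRegressor().fit(X[idx], y[idx])
        trees.append(tree)
        oob_sum[oob] += tree.predict(X[oob])
        oob_cnt[oob] += 1
    # Average the OOB predictions; points never left out get 0, so their target is kept as-is.
    oob_pred = oob_sum / np.maximum(oob_cnt, 1)
    return trees, oob_pred

def iterated_bagging_fit(X, y, n_stages=3, n_bags=25, seed=0):
    """Stack bagged stages, each fitted to the previous stage's OOB residuals."""
    rng = np.random.default_rng(seed)
    stages, target = [], y.astype(float)
    for _ in range(n_stages):
        trees, oob_pred = bagged_stage(X, target, n_bags, rng)
        stages.append(trees)
        target = target - oob_pred                # next stage learns the residuals
    return stages

def iterated_bagging_predict(stages, X):
    """Final prediction: sum of each stage's bagged (averaged) prediction."""
    return sum(np.mean([t.predict(X) for t in trees], axis=0) for trees in stages)

The fixed n_stages used here just keeps the sketch short; the paper instead stops adding stages once the residual sum of squares of a new stage no longer decreases.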

Related Items (26)

COVID-19 pandemic forecasting using CNN-LSTM: a hybrid approach
ESG score prediction through random forest algorithm
Properties of Bagged Nearest Neighbour Classifiers
On bagging and nonlinear estimation
Two-level quantile regression forests for bias correction in range prediction
Committee polyhedral separability: complexity and polynomial approximation
Pricing Bermudan Options Using Regression Trees/Random Forests
Unnamed Item
Derandomizing Knockoffs
Bias-corrected random forests in regression
AN EFFECTIVE BIAS-CORRECTED BAGGING METHOD FOR THE VALUATION OF LARGE VARIABLE ANNUITY PORTFOLIOS
Cross-validated bagged learning
A Data-Driven Random Subfeature Ensemble Learning Algorithm for Weather Forecasting
Boosting and instability for regression trees
Using boosting to prune double-bagging ensembles
Remembering Leo Breiman
Determining cutoff point of ensemble trees based on sample size in predicting clinical dose with DNA microarray data
Rotation Forests for regression
Stochastic gradient boosting.
Looking for lumps: boosting and bagging for density estimation.
Improving nonparametric regression methods by bagging and boosting.
Delta Boosting Machine with Application to General Insurance
A long short-term memory ensemble approach for improving the outcome prediction in intensive care unit
Optimal weighted nearest neighbour classifiers
Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
On weak base hypotheses and their implications for boosting regression and classification

This page was built for publication: Using iterated bagging to debias regressions