Pages that link to "Item:Q5959948"
From MaRDI portal
The following pages link to Using iterated bagging to debias regressions (Q5959948):
Displaying 19 items.
- Remembering Leo Breiman (Q542912)
- Optimal weighted nearest neighbour classifiers (Q741805)
- COVID-19 pandemic forecasting using CNN-LSTM: a hybrid approach (Q832776)
- On bagging and nonlinear estimation (Q866611)
- Two-level quantile regression forests for bias correction in range prediction (Q890300)
- Committee polyhedral separability: complexity and polynomial approximation (Q890319)
- Boosting and instability for regression trees (Q959181)
- Using boosting to prune double-bagging ensembles (Q961263)
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors) (Q1848780)
- On weak base hypotheses and their implications for boosting regression and classification (Q1848929)
- Determining cutoff point of ensemble trees based on sample size in predicting clinical dose with DNA microarray data (Q2013966)
- Rotation Forests for regression (Q2016344)
- ESG score prediction through random forest algorithm (Q2155224)
- A long short-term memory ensemble approach for improving the outcome prediction in intensive care unit (Q2283786)
- Cross-validated bagged learning (Q2474238)
- AN EFFECTIVE BIAS-CORRECTED BAGGING METHOD FOR THE VALUATION OF LARGE VARIABLE ANNUITY PORTFOLIOS (Q5140083)
- A Data-Driven Random Subfeature Ensemble Learning Algorithm for Weather Forecasting (Q5162342)
- Properties of Bagged Nearest Neighbour Classifiers (Q5313456)
- Pricing Bermudan Options Using Regression Trees/Random Forests (Q6070674)