Measuring the Algorithmic Convergence of Randomized Ensembles: The Regression Setting
Publication: 5037548
DOI: 10.1137/20M1343300
zbMath: 1490.62161
arXiv: 1908.01251
MaRDI QID: Q5037548
Suofei Wu, Miles E. Lopes, Thomas C. M. Lee
Publication date: 1 March 2022
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/1908.01251
62H30: Classification and discrimination; cluster analysis (statistical aspects)
62G09: Nonparametric statistical resampling methods
68T05: Learning and adaptive systems in artificial intelligence
Uses Software
Cites Work
- Correlation and variable importance in random forests
- Bagging predictors
- BART: Bayesian additive regression trees
- On the asymptotics of random forests
- Isoperimetry and integrability of the sum of independent Banach-space valued random variables
- Estimating the algorithmic variance of randomized ensembles via the bootstrap
- Optimal weighted nearest neighbour classifiers
- Standard errors for bagged and random forest estimators
- Extrapolation methods: theory and practice
- Second-order properties of an extrapolated bootstrap without replacement under weak assumptions
- Analyzing bagging
- How large should ensembles of classifiers be?
- Bootstrapping max statistics in high dimensions: near-parametric rates under weak variance decay and application to functional and multinomial data
- Variable importance in binary regression trees and forests
- Consistency of random forests
- Comments on: "A random forest guided tour"
- ggplot2
- Random Forests and Kernel Methods
- Scalable statistical inference for averaged implicit stochastic gradient descent
- Richardson Extrapolation and the Bootstrap
- Practical Extrapolation Methods
- Extrapolation of subsampling distribution estimators: The i.i.d. and strong mixing cases
- Properties of Bagged Nearest Neighbour Classifiers
- Random-projection Ensemble Classification
- Random Forests and Adaptive Nearest Neighbors
- The elements of statistical learning. Data mining, inference, and prediction
- Random forests