Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
Publication: 2284380
DOI: 10.1214/18-AOS1784
zbMath: 1436.62107
OpenAlex: W2982542510
Wikidata: Q114599286
Scholia: Q114599286
MaRDI QID: Q2284380
Authors: Max G'Sell, Alessandro Rinaldo, Larry Alan Wasserman
Publication date: 15 January 2020
Published in: The Annals of Statistics
Full work available at URL: https://projecteuclid.org/euclid.aos/1572487399
Mathematics Subject Classification (MSC):
- Linear regression; mixed models (62J05)
- Foundations and philosophical topics in statistics (62A01)
- Bootstrap, jackknife and other resampling methods (62F40)
Related Items
- Post-model-selection inference in linear regression models: an integrated review
- Mixed-effect models with trees
- Bootstrapping some GLM and survival regression variable selection estimators
- Comment: Statistical inference from a predictive perspective
- Uniform-in-submodel bounds for linear regression in a model-free framework
- Synthetic learner: model-free inference on treatments over time
- Dimension-agnostic inference using cross U-statistics
- Post-selection inference via algorithmic stability
- Rejoinder: On nearly assumption-free tests of nominal confidence interval coverage for causal parameters estimated by machine learning
- Two-directional simultaneous inference for high-dimensional models
- Testing conditional independence in supervised learning algorithms
- Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
- Exact minimax risk for linear least squares, and the lower tail of sample covariance matrices
- Averaging \(p\)-values under exchangeability
Uses Software
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Distribution-Free Predictive Inference for Regression
- Optimal-order bounds on the rate of convergence to normality in the multivariate delta method
- Exact post-selection inference, with application to the Lasso
- Valid post-selection inference
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- Statistical significance in high-dimensional linear models
- Random design analysis of ridge regression
- High-dimensional inference in misspecified linear models
- Bounds for the normal approximation of the maximum likelihood estimator
- A central limit theorem applicable to robust regression estimators
- High-dimensional variable selection
- Controlling the false discovery rate via knockoffs
- Stein's method for nonlinear statistics: a brief survey and recent progress
- Bootstrap consistency for quadratic forms of sample averages with increasing dimension
- Lower bounds for the rate of convergence in the central limit theorem in Banach spaces
- Uniform asymptotic inference and the bootstrap after model selection
- High-dimensional simultaneous inference with the bootstrap
- Selective inference with a randomized response
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Honest confidence regions for nonparametric regression
- Uniformly valid confidence intervals post-model-selection
- Multivariate normal approximation of the maximum likelihood estimator via the delta method
- Models as approximations. I. Consequences illustrated with linear regression
- A significance test for the lasso
- Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
- Comparison and anti-concentration bounds for maxima of Gaussian random vectors
- Central limit theorems and bootstrap in high dimensions
- Valid confidence intervals for post-model-selection predictors
- Gaussian approximations and multiplier bootstrap for maxima of sums of high-dimensional random vectors
- Confidence sets in sparse regression
- Normal approximation for nonlinear statistics using a concentration inequality approach
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Bounds for the asymptotic normality of the maximum likelihood estimator using the Delta method
- \(p\)-Values for High-Dimensional Regression
- Bootstrapping Lasso Estimators
- Can one estimate the unconditional distribution of post-model-selection estimators?
- Hybrid and Size-Corrected Subsampling Methods
- A note on data-splitting for the evaluation of significance levels
- Frequentist Model Average Estimators
- Probability for Statisticians
- Goodness-of-Fit Tests for High Dimensional Linear Models
- Stability Selection
- Panning for Gold: ‘Model-X’ Knockoffs for High Dimensional Controlled Variable Selection
- Estimation and Accuracy After Model Selection
- Variable Selection with Error Control: Another Look at Stability Selection
- Uniform post-selection inference for least absolute deviation regression and other Z-estimation problems
- Linear Model Selection by Cross-Validation
- Group Bound: Confidence Intervals for Groups of Variables in Sparse High Dimensional Regression Without Assumptions on the Design
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- Discussion: ``A significance test for the lasso''