Valid post-selection inference in model-free linear regression
DOI: 10.1214/19-AOS1917 · zbMath: 1455.62137 · OpenAlex: W3088350445 · MaRDI QID: Q2215767
Arun Kumar Kuchibhotla, Edward I. George, Linda H. Zhao, Junhui Cai, Lawrence D. Brown, Andreas Buja
Publication date: 14 December 2020
Published in: The Annals of Statistics
Full work available at URL: https://projecteuclid.org/euclid.aos/1600480938
Keywords: model selection; simultaneous inference; concentration inequalities; uniform consistency; multiplier bootstrap; high-dimensional linear regression; Orlicz norms
Mathematics Subject Classification: Asymptotic properties of parametric estimators (62F12); Estimation in multivariate analysis (62H12); Parametric tolerance and confidence regions (62F25); Linear regression; mixed models (62J05); Bootstrap, jackknife and other resampling methods (62F40); Analysis of variance and covariance (ANOVA) (62J10); Paired and multiple comparisons; multiple testing (62J15)
Cites Work
- Exact post-selection inference, with application to the Lasso
- Valid post-selection inference
- Using i.i.d. bootstrap inference for general non-i.i.d. models
- On the post selection inference constant under restricted isometry properties
- Uniform asymptotic inference and the bootstrap after model selection
- Sparse estimation of high-dimensional correlation matrices
- Uniformly valid confidence intervals post-model-selection
- Models as approximations. I. Consequences illustrated with linear regression
- Central limit theorems and bootstrap in high dimensions
- Valid confidence intervals for post-model-selection predictors
- The Conditional Level of the F-Test
- Inflation of $R^2$ in Best Subset Regression
- Frequentist Model Average Estimators
- An asymptotic theory for model selection inference in general semiparametric problems
- Note on a Conditional Property of Student's $t$
- Linear and Conic Programming Estimators in High Dimensional Errors-in-variables Models