Uniform asymptotic inference and the bootstrap after model selection
Publication: 1650078
DOI: 10.1214/17-AOS1584 · zbMath: 1392.62210 · arXiv: 1506.06266 · OpenAlex: W2963312390 · MaRDI QID: Q1650078
Alessandro Rinaldo, Ryan J. Tibshirani, Larry Alan Wasserman, Robert Tibshirani
Publication date: 29 June 2018
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1506.06266
MSC classification:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
- Robustness and adaptive procedures (parametric inference) (62F35)
- Asymptotic properties of parametric tests (62F05)
Related Items (26)
- On the post selection inference constant under restricted isometry properties
- Exact post-selection inference, with application to the Lasso
- Post-model-selection inference in linear regression models: an integrated review
- Ridge regression revisited: debiasing, thresholding and bootstrap
- Uniform asymptotic inference and the bootstrap after model selection
- Exploiting Disagreement Between High-Dimensional Variable Selectors for Uncertainty Visualization
- Uniformly valid confidence intervals post-model-selection
- Optimal finite sample post-selection confidence distributions in generalized linear models
- Post-selection inference for changepoint detection algorithms with application to copy number variation data
- Selective inference after feature selection via multiscale bootstrap
- More Powerful Selective Inference for the Graph Fused Lasso
- Bootstrapping some GLM and survival regression variable selection estimators
- Selective Inference for Hierarchical Clustering
- Forward-selected panel data approach for program evaluation
- Carving model-free inference
- Valid post-selection inference in model-free linear regression
- Exact post-selection inference for the generalized Lasso path
- Selective inference with a randomized response
- Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
- Inference after estimation of breaks
- Multicarving for high-dimensional post-selection inference
- Selective inference for latent block models
- Inference for \(L_2\)-boosting
- Bootstrapping multiple linear regression after variable selection
- Selective inference for additive and linear mixed models
- High-dimensional statistical inference via DATE
Cites Work
- Inference in adaptive regression via the Kac-Rice formula
- Exact post-selection inference, with application to the Lasso
- Valid post-selection inference
- Selecting the number of principal components: estimation of the true rank of a noisy matrix
- Can one estimate the conditional distribution of post-model-selection estimators?
- One-sided inference about functionals of a density
- Uniform asymptotic inference and the bootstrap after model selection
- A significance test for the lasso
- Uniformity and the delta method
- Valid confidence intervals for post-model-selection predictors
- Can one estimate the unconditional distribution of post-model-selection estimators?
- Bayes estimation subject to uncertainty about parameter constraints
- Asymptotic Statistics
- The finite-sample distribution of post-model-selection estimators and uniform versus nonuniform approximations
- Post‐selection point and interval estimation of signal sizes in Gaussian samples
- Asymptotics of Selective Inference
- Discussion: "A significance test for the lasso"