In defense of the indefensible: a very naïve approach to high-dimensional inference
Publication: 2075709
DOI: 10.1214/20-STS815
MaRDI QID: Q2075709
Sen Zhao, Ali Shojaie, Daniela M. Witten
Publication date: 15 February 2022
Published in: Statistical Science
Full work available at URL: https://arxiv.org/abs/1705.05543
Related Items
- Scalable inference for high-dimensional precision matrix
- Post-model-selection inference in linear regression models: an integrated review
- Subset Selection for Linear Mixed Models
- Fast multiscale functional estimation in optimal EMG placement for robotic prosthesis controllers
- Rotation to sparse loadings using \(L^p\) losses and related inference problems
- The low-volatility anomaly and the adaptive multi-factor model
- Survival analysis of DNA mutation motifs with penalized proportional hazards
- Network differential connectivity analysis
Uses Software
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Nearly unbiased variable selection under minimax concave penalty
- On various confidence intervals post-model-selection
- Exact post-selection inference, with application to the Lasso
- Valid post-selection inference
- A general theory of hypothesis tests and confidence regions for sparse high dimensional models
- A necessary and sufficient condition for exact sparse recovery by \(\ell_1\) minimization
- High-dimensional variable selection
- Can one estimate the conditional distribution of post-model-selection estimators?
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Statistical analysis of network data. Methods and models
- Significance testing in non-sparse high-dimensional linear models
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- The Lasso problem and uniqueness
- On the conditions used to prove oracle results for the Lasso
- Least squares after model selection in high-dimensional sparse models
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional graphs and variable selection with the Lasso
- Variance estimation in high-dimensional linear models
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Statistical learning and selective inference
- Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
- p-Values for High-Dimensional Regression
- Scaled sparse linear regression
- Debiased Inference on Treatment Effect in a High-Dimensional Model
- Performance limits for estimators of the risk or distribution of shrinkage-type estimators, and some general lower risk-bound results
- A study of error variance estimation in Lasso regression
- Just relax: convex programming methods for identifying sparse signals in noise
- Can one estimate the unconditional distribution of post-model-selection estimators?
- A note on data-splitting for the evaluation of significance levels
- The finite-sample distribution of post-model-selection estimators and uniform versus nonuniform approximations
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- On the Role of the Propensity Score in Efficient Semiparametric Estimation of Average Treatment Effects
- Variance Estimation Using Refitted Cross-Validation in Ultrahigh Dimensional Regression
- Sharp thresholds for high-dimensional and noisy sparsity recovery using \(\ell_1\)-constrained quadratic programming (Lasso)
- Graph estimation with joint additive models
- Efficient Estimation of Average Treatment Effects Using the Estimated Propensity Score
- Random Graphs
- Model selection and inference: facts and fiction
- A General Framework for Weighted Gene Co-Expression Network Analysis
- A significance test for graph-constrained estimation
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models