Debiasing the debiased Lasso with bootstrap
From MaRDI portal
DOI: 10.1214/20-EJS1713 · zbMath: 1445.62185 · arXiv: 1711.03613 · MaRDI QID: Q2192302
Publication date: 17 August 2020
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1711.03613
Mathematics Subject Classification:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Nonparametric tolerance and confidence regions (62G15)
- Nonparametric statistical resampling methods (62G09)
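The classifications above concern inference with the Lasso. As background for the publication's topic, the following is a minimal, hypothetical sketch of the standard debiased (de-sparsified) Lasso one-step correction — an initial Lasso fit, a nodewise-Lasso approximation M of the inverse covariance, and the bias correction \(\hat\beta^d = \hat\beta + M X^\top (y - X\hat\beta)/n\). All tuning choices and variable names here are illustrative assumptions, not the paper's exact procedure (which adds a further bootstrap debiasing step).

```python
import numpy as np
from sklearn.linear_model import Lasso

# Simulated sparse linear model (illustrative assumption, not the paper's setup).
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + rng.standard_normal(n)

# Step 1: initial Lasso fit (tuning parameter chosen arbitrarily here).
lam = 0.1
beta_hat = Lasso(alpha=lam).fit(X, y).coef_

# Step 2: nodewise Lasso to build M, a relaxed inverse of the sample covariance.
M = np.zeros((p, p))
for j in range(p):
    idx = np.arange(p) != j
    gamma = Lasso(alpha=lam).fit(X[:, idx], X[:, j]).coef_
    resid = X[:, j] - X[:, idx] @ gamma
    tau2 = resid @ X[:, j] / n          # normalization \hat\tau_j^2
    row = np.zeros(p)
    row[j] = 1.0
    row[idx] = -gamma
    M[j] = row / tau2

# Step 3: one-step bias correction of the Lasso estimate.
beta_debiased = beta_hat + M @ X.T @ (y - X @ beta_hat) / n
```

On this toy design the corrected coordinates on the true support are, typically, much closer to the truth than the shrunken Lasso estimates, which is what makes the debiased estimator a usable basis for confidence intervals.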
Related Items
- Assessing the Most Vulnerable Subgroup to Type II Diabetes Associated with Statin Usage: Evidence from Electronic Health Record Data
- Honest Confidence Sets for High-Dimensional Regression by Projection and Shrinkage
- Variable selection in the Box-Cox power transformation model
Uses Software
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Nearly unbiased variable selection under minimax concave penalty
- Confidence intervals for high-dimensional inverse covariance estimation
- The Adaptive Lasso and Its Oracle Properties
- Exact post-selection inference, with application to the Lasso
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- High-dimensional inference in misspecified linear models
- High-dimensional variable selection
- High-dimensional simultaneous inference with the bootstrap
- Asymptotics for Lasso-type estimators
- Debiasing the Lasso: optimal sample size for Gaussian designs
- Confidence intervals for high-dimensional linear regression: minimax rates and adaptivity
- A significance test for the lasso
- Bootstrap and wild bootstrap for high dimensional linear models
- Gaussian approximations and multiplier bootstrap for maxima of sums of high-dimensional random vectors
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- High-dimensional graphs and variable selection with the Lasso
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Asymptotic properties of the residual bootstrap for Lasso estimators
- Bootstrapping Lasso Estimators
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Scaled sparse linear regression
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Testing and Confidence Intervals for High Dimensional Proportional Hazards Models
- Inference on Treatment Effects after Selection among High-Dimensional Controls
- Stability Selection
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Double/debiased machine learning for treatment and structural parameters
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models