Debiasing the debiased Lasso with bootstrap
Publication: Q2192302
DOI: 10.1214/20-EJS1713
zbMATH Open: 1445.62185
arXiv: 1711.03613
MaRDI QID: Q2192302
Publication date: 17 August 2020
Published in: Electronic Journal of Statistics
Abstract: We consider statistical inference for a single coordinate of the regression coefficients in high-dimensional linear models. Debiased estimators have recently become popular for constructing confidence intervals and testing hypotheses in high-dimensional models. However, representative numerical experiments show that they tend to be biased for large coefficients, especially when the number of large coefficients dominates the number of small coefficients. In this paper, we propose a modified debiased Lasso estimator based on the bootstrap, denoted BS-DB for short. We show that, under the irrepresentable condition and other mild technical conditions, the BS-DB has a smaller order of bias than the debiased Lasso in the presence of a large proportion of strong signals. If the irrepresentable condition does not hold, the BS-DB is guaranteed to perform no worse than the debiased Lasso asymptotically. Confidence intervals based on the BS-DB are proposed and proved to be asymptotically valid under mild conditions. Our study of the inference problem integrates the variable-selection and estimation properties of the Lasso in a novel way. The superior performance of the BS-DB over the debiased Lasso is demonstrated via extensive numerical studies.
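To make the objects in the abstract concrete, the following is a minimal illustrative sketch of the standard debiased-Lasso estimator together with a residual-bootstrap bias estimate in the spirit described above. It is not the paper's BS-DB procedure: the penalty level, the use of the inverse sample covariance (which assumes p < n; the high-dimensional case would use node-wise Lasso), and the bootstrap centering are all simplifying assumptions made for illustration.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator used by coordinate-descent Lasso.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Plain coordinate-descent Lasso for (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_norm2 = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding coordinate j, then 1-D update.
            r_j = y - X @ beta + X[:, j] * beta[j]
            beta[j] = soft_threshold(X[:, j] @ r_j, n * lam) / col_norm2[j]
    return beta

def debiased_lasso(X, y, lam):
    """Debiased Lasso: beta_hat + M X^T (y - X beta_hat) / n.
    Here M is the inverse sample covariance (requires p < n); in high
    dimensions M would be built by node-wise Lasso instead."""
    n, p = X.shape
    beta = lasso_cd(X, y, lam)
    M = np.linalg.inv(X.T @ X / n)
    return beta + M @ X.T @ (y - X @ beta) / n

def bootstrap_bias(X, y, lam, B=50, seed=None):
    """Residual-bootstrap estimate of the debiased-Lasso bias (a sketch;
    the paper's BS-DB construction differs in its details)."""
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    beta_db = debiased_lasso(X, y, lam)
    resid = y - X @ lasso_cd(X, y, lam)
    resid = resid - resid.mean()
    boots = []
    for _ in range(B):
        # Resample centered residuals around the debiased fit.
        y_star = X @ beta_db + rng.choice(resid, size=n, replace=True)
        boots.append(debiased_lasso(X, y_star, lam))
    # Estimated bias per coordinate: mean bootstrap estimate minus original.
    return np.mean(boots, axis=0) - beta_db
```

The one-step correction in `debiased_lasso` removes (to first order) the shrinkage bias of the Lasso in each coordinate; the bootstrap loop then estimates whatever bias remains, which is the quantity the BS-DB construction targets.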
Full work available at URL: https://arxiv.org/abs/1711.03613
Recommendations
- Debiasing the Lasso: optimal sample size for Gaussian designs
- On the asymptotic variance of the debiased Lasso
- Debiased lasso for generalized linear models with a diverging number of covariates
- A statistical mechanics approach to de-biasing and uncertainty estimation in Lasso for random measurements
- Posterior asymptotic normality for an individual coordinate in high-dimensional linear regression
Classification (MSC):
- Nonparametric statistical resampling methods (62G09)
- Nonparametric tolerance and confidence regions (62G15)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- High-dimensional simultaneous inference with the bootstrap
- Confidence intervals for high-dimensional inverse covariance estimation
- The Adaptive Lasso and Its Oracle Properties
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Stability Selection
- Title not available
- Confidence intervals for high-dimensional linear regression: minimax rates and adaptivity
- The Dantzig selector: statistical estimation when p is much larger than n. (With discussions and rejoinder)
- High-dimensional graphs and variable selection with the Lasso
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Title not available
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using ℓ1-Constrained Quadratic Programming (Lasso)
- Double/debiased machine learning for treatment and structural parameters
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Asymptotics for Lasso-type estimators.
- A significance test for the lasso
- Bootstrap and wild bootstrap for high dimensional linear models
- Gaussian approximations and multiplier bootstrap for maxima of sums of high-dimensional random vectors
- Exact post-selection inference, with application to the Lasso
- Bootstrapping Lasso Estimators
- Scaled sparse linear regression
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- Inference on Treatment Effects after Selection among High-Dimensional Controls
- High-dimensional variable selection
- Asymptotic properties of the residual bootstrap for Lasso estimators
- Testing and Confidence Intervals for High Dimensional Proportional Hazards Models
- High-dimensional inference in misspecified linear models
- Debiasing the Lasso: optimal sample size for Gaussian designs
Cited In (5)
- Variable selection in the Box-Cox power transformation model
- Debiased lasso for generalized linear models with a diverging number of covariates
- Assessing the Most Vulnerable Subgroup to Type II Diabetes Associated with Statin Usage: Evidence from Electronic Health Record Data
- Honest Confidence Sets for High-Dimensional Regression by Projection and Shrinkage
- Estimation and Inference for High-Dimensional Generalized Linear Models with Knowledge Transfer