Debiasing the debiased Lasso with bootstrap
From MaRDI portal
Publication:2192302
Abstract: We consider statistical inference for a single coordinate of the regression coefficient vector in high-dimensional linear models. Debiased estimators have recently become popular for constructing confidence intervals and testing hypotheses in high-dimensional models. However, representative numerical experiments show that they tend to be biased for large coefficients, especially when large coefficients outnumber small ones. In this paper, we propose a bootstrap-based modification of the debiased Lasso estimator, denoted BS-DB for short. We show that, under the irrepresentable condition and other mild technical conditions, the BS-DB has bias of smaller order than the debiased Lasso in the presence of a large proportion of strong signals. If the irrepresentable condition does not hold, the BS-DB is guaranteed to perform no worse than the debiased Lasso asymptotically. Confidence intervals based on the BS-DB are proposed and proved to be asymptotically valid under mild conditions. Our study of the inference problem integrates the variable-selection and estimation properties of the Lasso in a novel way. The superior performance of the BS-DB over the debiased Lasso is demonstrated via extensive numerical studies.
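The abstract builds on the standard debiased-Lasso correction. A minimal sketch of that baseline correction is below, shown in a low-dimensional setting where the sample covariance is invertible; the paper's BS-DB estimator adds a bootstrap bias correction on top of this, which is not reproduced here. The design sizes, penalty level, and coefficient values are illustrative assumptions, not taken from the paper.

```python
# Sketch of the classical debiased-Lasso correction:
#   b_db = b_lasso + M X^T (y - X b_lasso) / n,
# where M approximates the inverse of the sample covariance.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 10                        # illustrative sizes (n > p for simplicity)
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]           # a few strong signals
y = X @ beta + 0.5 * rng.standard_normal(n)

b_lasso = Lasso(alpha=0.1).fit(X, y).coef_   # biased toward zero by the penalty

Sigma_hat = X.T @ X / n
M = np.linalg.inv(Sigma_hat)          # fine since n > p; the high-dimensional
                                      # case would use a nodewise-Lasso estimate
b_db = b_lasso + M @ X.T @ (y - X @ b_lasso) / n
```

On strong signals such as the first coordinate, the correction term offsets the shrinkage bias of the Lasso, which is exactly the bias the paper's bootstrap step targets when strong signals dominate.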
Recommendations
- Debiasing the Lasso: optimal sample size for Gaussian designs
- On the asymptotic variance of the debiased Lasso
- Debiased lasso for generalized linear models with a diverging number of covariates
- A statistical mechanics approach to de-biasing and uncertainty estimation in Lasso for random measurements
- Posterior asymptotic normality for an individual coordinate in high-dimensional linear regression
Cites work
- scientific article; zbMATH DE number 5957408
- scientific article; zbMATH DE number 845714
- A significance test for the lasso
- Asymptotic properties of the residual bootstrap for lasso estimators
- Asymptotics for Lasso-type estimators
- Bootstrap and wild bootstrap for high dimensional linear models
- Bootstrapping Lasso estimators
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Confidence intervals for high-dimensional inverse covariance estimation
- Confidence intervals for high-dimensional linear regression: minimax rates and adaptivity
- Confidence intervals for low dimensional parameters in high dimensional linear models
- Debiasing the Lasso: optimal sample size for Gaussian designs
- Double/debiased machine learning for treatment and structural parameters
- Exact post-selection inference, with application to the Lasso
- Gaussian approximations and multiplier bootstrap for maxima of sums of high-dimensional random vectors
- High-dimensional graphs and variable selection with the Lasso
- High-dimensional inference in misspecified linear models
- High-dimensional simultaneous inference with the bootstrap
- High-dimensional variable selection
- Inference on treatment effects after selection among high-dimensional controls
- Nearly unbiased variable selection under minimax concave penalty
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- Scaled sparse linear regression
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Stability Selection
- Testing and Confidence Intervals for High Dimensional Proportional Hazards Models
- The Adaptive Lasso and Its Oracle Properties
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in (12)
- A bootstrap Lasso+partial ridge method to construct confidence intervals for parameters in high-dimensional sparse linear models
- A statistical mechanics approach to de-biasing and uncertainty estimation in Lasso for random measurements
- On the asymptotic variance of the debiased Lasso
- Debiasing the Lasso: optimal sample size for Gaussian designs
- Variable selection in the Box-Cox power transformation model
- Debiased lasso for generalized linear models with a diverging number of covariates
- Assessing the Most Vulnerable Subgroup to Type II Diabetes Associated with Statin Usage: Evidence from Electronic Health Record Data
- Overview of debiased Lasso in high-dimensional linear model
- Honest Confidence Sets for High-Dimensional Regression by Projection and Shrinkage
- Debiasing convex regularized estimators and interval estimation in linear models
- The de-biased group Lasso estimation for varying coefficient models
- Estimation and Inference for High-Dimensional Generalized Linear Models with Knowledge Transfer