De-biasing the Lasso with degrees-of-freedom adjustment
From MaRDI portal
Publication: 2136990
DOI: 10.3150/21-BEJ1348
MaRDI QID: Q2136990
Publication date: 16 May 2022
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/1902.08885
Keywords: efficiency; semiparametric model; Fisher information; statistical inference; confidence interval; high-dimensional data; regression; \(p\)-value
Related Items (6)
- De-biasing the Lasso with degrees-of-freedom adjustment
- Debiasing convex regularized estimators and interval estimation in linear models
- Universality of regularized regression estimators in high dimensions
- The Lasso with general Gaussian designs with applications to hypothesis testing
- Spatially relaxed inference on high-dimensional linear models
- Asymptotic normality of robust \(M\)-estimators with convex penalty
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Nearly unbiased variable selection under minimax concave penalty
- Estimation and testing under sparsity. École d'Été de Probabilités de Saint-Flour XLV -- 2015
- Statistical significance in high-dimensional linear models
- Degrees of freedom in lasso problems
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- On asymptotically efficient estimation in semiparametric models
- Significance testing in non-sparse high-dimensional linear models
- Least squares after model selection in high-dimensional sparse models
- Debiasing the Lasso: optimal sample size for Gaussian designs
- Confidence intervals for high-dimensional linear regression: minimax rates and adaptivity
- Second-order Stein: SURE for SURE and other applications in high-dimensional inference
- De-biasing the Lasso with degrees-of-freedom adjustment
- Pivotal estimation via square-root lasso in nonparametric regression
- Estimation of sums of random variables: examples and information bounds
- Simultaneous analysis of Lasso and Dantzig selector
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- On the ``degrees of freedom'' of the lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
- Linear Hypothesis Testing in Dense High-Dimensional Linear Models
- Scaled sparse linear regression
- Decoding by Linear Programming
- Just relax: convex programming methods for identifying sparse signals in noise
- Asymptotic Statistics
- Inference on Treatment Effects after Selection among High-Dimensional Controls
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- A general theory of concave regularization for high-dimensional sparse estimation problems