De-biasing the Lasso with degrees-of-freedom adjustment
From MaRDI portal
Publication: 2136990
DOI: 10.3150/21-BEJ1348
MaRDI QID: Q2136990
Authors: Pierre C. Bellec, Cun-Hui Zhang
Publication date: 16 May 2022
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/1902.08885
Keywords: confidence interval; high-dimensional data; regression; semiparametric model; efficiency; \(p\)-value; statistical inference; Fisher information
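As context for the publication this record describes, the titular construction can be sketched in a few lines. The following is an illustrative NumPy sketch under simplifying assumptions (iid standard Gaussian design, precision matrix taken as the identity, Lasso fit by plain ISTA), not the estimator exactly as analyzed in the paper: the one-step correction term of the debiased Lasso is rescaled by \(n - \hat{\mathrm{df}}\) instead of \(n\), where \(\hat{\mathrm{df}}\) is the size of the Lasso's selected support (its degrees of freedom).

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, s = 200, 50, 3

# Sparse linear model y = X beta + noise with iid Gaussian design.
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 2.0
y = X @ beta + rng.standard_normal(n)

# Lasso via ISTA (proximal gradient) on (1/2n)||y - Xb||^2 + lam*||b||_1.
lam = np.sqrt(2 * np.log(p) / n)       # standard theoretical tuning
L = np.linalg.norm(X, 2) ** 2 / n      # Lipschitz constant of the gradient
b = np.zeros(p)
for _ in range(1000):
    z = b + X.T @ (y - X @ b) / (n * L)
    b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

# Degrees of freedom of the Lasso = size of the selected support.
df = int(np.count_nonzero(b))

# Debiased Lasso with degrees-of-freedom adjustment: the one-step
# correction is divided by (n - df) rather than n. Taking the precision
# matrix to be the identity is an approximation made for this sketch.
b_debiased = b + X.T @ (y - X @ b) / (n - df)
```

On the true support, the raw Lasso coordinates are shrunk toward zero, while the df-adjusted debiased coordinates land near the true value 2.0.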
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Significance testing in non-sparse high-dimensional linear models
- Asymptotic Statistics
- Confidence intervals for high-dimensional linear regression: minimax rates and adaptivity
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Least squares after model selection in high-dimensional sparse models
- Estimation and testing under sparsity. École d'Été de Probabilités de Saint-Flour XLV -- 2015
- Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
- Scaled sparse linear regression
- Statistical significance in high-dimensional linear models
- Degrees of freedom in lasso problems
- Inference on treatment effects after selection among high-dimensional controls
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Decoding by Linear Programming
- Just relax: convex programming methods for identifying sparse signals in noise
- On the "degrees of freedom" of the lasso
- A general theory of concave regularization for high-dimensional sparse estimation problems
- On asymptotically efficient estimation in semiparametric models
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Pivotal estimation via square-root lasso in nonparametric regression
- Estimation of sums of random variables: examples and information bounds
- Linear hypothesis testing in dense high-dimensional linear models
- Debiasing the Lasso: optimal sample size for Gaussian designs
- Second-order Stein: SURE for SURE and other applications in high-dimensional inference
- De-biasing the Lasso with degrees-of-freedom adjustment
Cited In (8)
- De-Biasing The Lasso With Degrees-of-Freedom Adjustment
- Asymptotic normality of robust \(M\)-estimators with convex penalty
- Debiased lasso for generalized linear models with a diverging number of covariates
- Universality of regularized regression estimators in high dimensions
- Debiasing convex regularized estimators and interval estimation in linear models
- Spatially relaxed inference on high-dimensional linear models
- De-biasing the Lasso with degrees-of-freedom adjustment
- The Lasso with general Gaussian designs with applications to hypothesis testing
This page was built for publication: De-biasing the Lasso with degrees-of-freedom adjustment