De-biasing the Lasso with degrees-of-freedom adjustment
From MaRDI portal
Publication:2136990
Cites work
- Scientific article, zbMATH DE number 5957408 (title unavailable)
- Scientific article, zbMATH DE number 490141 (title unavailable)
- A general theory of concave regularization for high-dimensional sparse estimation problems
- Asymptotic Statistics
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Confidence intervals for high-dimensional linear regression: minimax rates and adaptivity
- Confidence intervals for low dimensional parameters in high dimensional linear models
- De-biasing the Lasso with degrees-of-freedom adjustment
- Debiasing the Lasso: optimal sample size for Gaussian designs
- Decoding by Linear Programming
- Degrees of freedom in lasso problems
- Estimation and testing under sparsity. École d'Été de Probabilités de Saint-Flour XLV -- 2015
- Estimation of sums of random variables: examples and information bounds
- High-dimensional graphs and variable selection with the Lasso
- Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
- Inference on treatment effects after selection among high-dimensional controls
- Just relax: convex programming methods for identifying sparse signals in noise
- Least squares after model selection in high-dimensional sparse models
- Linear hypothesis testing in dense high-dimensional linear models
- Nearly unbiased variable selection under minimax concave penalty
- On asymptotically efficient estimation in semiparametric models
- On asymptotically optimal confidence regions and tests for high-dimensional models
- On the ``degrees of freedom'' of the lasso
- Pivotal estimation via square-root lasso in nonparametric regression
- Scaled sparse linear regression
- Second-order Stein: SURE for SURE and other applications in high-dimensional inference
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Significance testing in non-sparse high-dimensional linear models
- Simultaneous analysis of Lasso and Dantzig selector
- Statistical significance in high-dimensional linear models
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
Cited in (8)
- De-Biasing The Lasso With Degrees-of-Freedom Adjustment
- Debiased lasso for generalized linear models with a diverging number of covariates
- The Lasso with general Gaussian designs with applications to hypothesis testing
- Spatially relaxed inference on high-dimensional linear models
- Asymptotic normality of robust \(M\)-estimators with convex penalty
- De-biasing the Lasso with degrees-of-freedom adjustment
- Universality of regularized regression estimators in high dimensions
- Debiasing convex regularized estimators and interval estimation in linear models
This page was built for publication: De-biasing the Lasso with degrees-of-freedom adjustment