Overview of debiased Lasso in high-dimensional linear model
From MaRDI portal
Publication: 6181971
Recommendations
- Debiasing the debiased Lasso with bootstrap
- On the asymptotic variance of the debiased Lasso
- Projection-based Inference for High-dimensional Linear Models
- Posterior asymptotic normality for an individual coordinate in high-dimensional linear regression
- Debiasing the Lasso: optimal sample size for Gaussian designs
Cites work
- Scientific article; zbMATH DE number 5957408 (no title available)
- Scientific article; zbMATH DE number 845714 (no title available)
- Adaptive Forward-Backward Greedy Algorithm for Learning Sparse Representations
- Asymptotics for Lasso-type estimators.
- Atomic decomposition by basis pursuit
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(\ell_{1}\) constraint
- Comments on: \(\ell_{1}\)-penalization for mixture regression models
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Confidence intervals for low dimensional parameters in high dimensional linear models
- Corrigendum in “Just Relax: Convex Programming Methods for Identifying Sparse Signals in Noise” [Mar 06 1030-1051]
- High-dimensional graphs and variable selection with the Lasso
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- High-dimensional simultaneous inference with the bootstrap
- Lasso-type recovery of sparse representations for high-dimensional data
- Nearly unbiased variable selection under minimax concave penalty
- On asymptotically optimal confidence regions and tests for high-dimensional models
- On the conditions used to prove oracle results for the Lasso
- Persistence in high-dimensional linear predictor-selection and the virtue of overparametrization
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- Scaled sparse linear regression
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Simultaneous analysis of Lasso and Dantzig selector
- Sparse inverse covariance estimation with the graphical lasso
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Statistics for high-dimensional data. Methods, theory and applications.
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
Cited in (2)