Abstract: Recent results have established the minimax optimality of LASSO and related algorithms for noisy linear regression. However, these results tend to rely on variance estimators that are inefficient, or on optimizations that are slower than LASSO itself. We propose an efficient estimator for the noise variance in high-dimensional linear regression that is faster than LASSO, requiring only matrix-vector multiplications. We prove that this estimator is consistent, with a good rate of convergence, provided the design matrix satisfies the Restricted Isometry Property (RIP). In practice, our estimator scales well to high dimensions, is highly parallelizable, and incurs only a modest bias.
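The abstract describes a noise-variance estimator built from matrix-vector multiplications alone. The paper's own greedy estimator is not reproduced on this page, so the following is only an illustrative sketch of the general idea: a matching-pursuit-style greedy support selection (one matrix-vector product per step) followed by a degrees-of-freedom-corrected residual variance. The function name and all parameters are hypothetical, not from the paper.

```python
import numpy as np

def greedy_variance_estimate(X, y, n_steps=5):
    """Illustrative greedy noise-variance estimate (NOT the paper's algorithm).

    Each greedy step costs one matrix-vector product X.T @ residual;
    the refit is a small least-squares problem on the selected support.
    """
    n, _p = X.shape
    residual = y.copy()
    support = []
    for _ in range(n_steps):
        # Matrix-vector product: correlations of all columns with the residual.
        corr = X.T @ residual
        j = int(np.argmax(np.abs(corr)))
        if j not in support:
            support.append(j)
        # Least-squares refit restricted to the current support.
        beta_s, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ beta_s
    # Degrees-of-freedom-corrected residual variance.
    return float(residual @ residual) / (n - len(support))
```

On simulated sparse data with unit noise variance, the estimate concentrates near 1 once the true support is captured; the bias from imperfect selection is the "modest bias" trade-off the abstract alludes to.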
Recommendations
- Estimating the error variance in a high-dimensional linear model
- A study of error variance estimation in Lasso regression
- On the asymptotic variance of the debiased Lasso
- SOCP based variance free Dantzig selector with application to robust estimation
- High-dimensional regression with unknown variance
Cites work
- scientific article; zbMATH DE number 845714
- A simple proof of the restricted isometry property for random matrices
- A study of error variance estimation in Lasso regression
- Adaptive estimation of a quadratic functional by model selection
- Book Review: A mathematical introduction to compressive sensing
- Compressive sensing and structured random matrices
- Decoding by Linear Programming
- High-dimensional generalized linear models and the lasso
- How well can we estimate a sparse vector?
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Lasso-type recovery of sparse representations for high-dimensional data
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- On sparse reconstruction from Fourier and Gaussian measurements
- On the conditions used to prove oracle results for the Lasso
- Oracle inequalities and optimal inference under group sparsity
- Rate minimaxity of the Lasso and Dantzig selector for the \(l_{q}\) loss in \(l_{r}\) balls
- Simultaneous analysis of Lasso and Dantzig selector
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Variance estimation in high-dimensional linear models
- Variance estimation using refitted cross-validation in ultrahigh dimensional regression
Cited in (7)
- Estimating the error variance in a high-dimensional linear model
- Stabilizing the Lasso against cross-validation variability
- Variance estimation in high-dimensional linear regression via adaptive elastic-net
- LASSO-TYPE GMM ESTIMATOR
- \(\ell_1\)-penalised ordinal polytomous regression estimators with application to gene expression studies
- A study of error variance estimation in Lasso regression
- Variant of Greedy Randomized Gauss-Seidel Method for Ridge Regression
This page was built for publication: Greedy variance estimation for the LASSO
MaRDI item Q2019914