Greedy variance estimation for the LASSO
From MaRDI portal
Publication: 2019914
DOI: 10.1007/S00245-019-09561-6 · zbMATH Open: 1464.62348 · arXiv: 1803.10878 · OpenAlex: W2963160388 · Wikidata: Q128217785 · Scholia: Q128217785 · MaRDI QID: Q2019914 · FDO: Q2019914
Authors: Christopher A. Kennedy, Rachel Ward
Publication date: 22 April 2021
Published in: Applied Mathematics and Optimization
Abstract: Recent results have proven the minimax optimality of LASSO and related algorithms for noisy linear regression. However, these results tend to rely on variance estimators that are inefficient or optimizations that are slower than LASSO itself. We propose an efficient estimator for the noise variance in high dimensional linear regression that is faster than LASSO, only requiring matrix-vector multiplications. We prove this estimator is consistent with a good rate of convergence, under the condition that the design matrix satisfies the Restricted Isometry Property (RIP). In practice, our estimator scales incredibly well into high dimensions, is highly parallelizable, and only incurs a modest bias.
Full work available at URL: https://arxiv.org/abs/1803.10878
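The abstract states that the proposed noise-variance estimator requires only matrix-vector multiplications and works under an RIP design. As a rough illustration of that idea (not the paper's exact algorithm), the sketch below greedily selects columns most correlated with the residual, orthogonal-matching-pursuit style, and reads the noise variance off the residual norm; the step count `k` and the refitting strategy are assumptions for this sketch.

```python
import numpy as np

def greedy_variance_estimate(X, y, k):
    """Hypothetical sketch: estimate the noise variance sigma^2 in the
    model y = X @ beta + noise by greedily selecting k columns of X most
    correlated with the current residual, projecting them out, and
    applying a degrees-of-freedom correction to the residual norm.
    The actual estimator of Kennedy and Ward may differ in detail."""
    n, p = X.shape
    residual = y.copy()
    support = []
    for _ in range(k):
        # The dominant cost per step is one matrix-vector product X.T @ residual.
        correlations = np.abs(X.T @ residual)
        correlations[support] = 0.0  # do not reselect chosen columns
        support.append(int(np.argmax(correlations)))
        # Refit on the current support and update the residual.
        Xs = X[:, support]
        coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        residual = y - Xs @ coef
    # Residual variance, corrected for the k fitted directions.
    return float(residual @ residual) / (n - k)
```

On a well-conditioned (RIP-like) random design with a strong sparse signal, the selected support captures the signal and the returned value concentrates near the true noise variance.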
Recommendations
- Estimating the error variance in a high-dimensional linear model
- A study of error variance estimation in Lasso regression
- On the asymptotic variance of the debiased Lasso
- SOCP based variance free Dantzig selector with application to robust estimation
- High-dimensional regression with unknown variance
Classification: Ridge regression; shrinkage estimators (Lasso) (62J07); Minimax procedures in statistical decision theory (62C20)
Cites Work
- Lasso-type recovery of sparse representations for high-dimensional data
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- A study of error variance estimation in Lasso regression
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Decoding by Linear Programming
- A simple proof of the restricted isometry property for random matrices
- Adaptive estimation of a quadratic functional by model selection.
- Oracle inequalities and optimal inference under group sparsity
- Rate minimaxity of the Lasso and Dantzig selector for the \(l_{q}\) loss in \(l_{r}\) balls
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- On sparse reconstruction from Fourier and Gaussian measurements
- How well can we estimate a sparse vector?
- Variance estimation in high-dimensional linear models
- Variance Estimation Using Refitted Cross-Validation in Ultrahigh Dimensional Regression
- Compressive sensing and structured random matrices
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Book Review: A mathematical introduction to compressive sensing
Cited In (7)
- LASSO-TYPE GMM ESTIMATOR
- Stabilizing the Lasso against cross-validation variability
- A study of error variance estimation in Lasso regression
- \(\ell_1\)-penalised ordinal polytomous regression estimators with application to gene expression studies
- Variance estimation in high-dimensional linear regression via adaptive elastic-net
- Variant of Greedy Randomized Gauss-Seidel Method for Ridge Regression
- Estimating the error variance in a high-dimensional linear model