Square-root lasso: pivotal recovery of sparse signals via conic programming

From MaRDI portal
Publication:3107973




Abstract: We propose a pivotal method for estimating high-dimensional sparse linear regression models, where the overall number of regressors p is large, possibly much larger than the sample size n, but only s regressors are significant. The method is a modification of the lasso, called the square-root lasso. The method is pivotal in that it neither relies on knowledge of the standard deviation sigma nor needs to pre-estimate sigma. Moreover, the method does not rely on normality or sub-Gaussianity of the noise. It achieves near-oracle performance, attaining the convergence rate sigma {(s/n) log p}^{1/2} in the prediction norm, and thus matching the performance of the lasso with known sigma. These performance results are valid for both Gaussian and non-Gaussian errors, under some mild moment restrictions. We formulate the square-root lasso as the solution to a convex conic programming problem, which allows us to implement the estimator using efficient algorithmic methods, such as interior-point and first-order methods.
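As an illustration of the estimator described above, the sketch below solves the square-root lasso objective min_beta ||y - X beta||_2 / sqrt(n) + (lam/n) ||beta||_1 by alternating a sigma update with lasso coordinate descent, rather than via the conic formulation the paper uses. All names, the solver choice, and the stopping rule are our own assumptions for illustration, not taken from the paper.

```python
import numpy as np

def sqrt_lasso(X, y, lam, n_iter=200, tol=1e-8):
    """Illustrative square-root lasso solver (not the paper's conic method).

    Alternates:
      sigma_hat = ||y - X beta||_2 / sqrt(n)
      one coordinate-descent sweep on the lasso with penalty lam * sigma_hat,
    which targets min_beta ||y - X beta||_2 / sqrt(n) + (lam/n) ||beta||_1.
    """
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)            # per-coordinate curvature
    resid = y - X @ beta
    for _ in range(n_iter):
        beta_old = beta.copy()
        # pivotal step: plug-in scale estimate, no known sigma required
        sigma = max(np.linalg.norm(resid) / np.sqrt(n), 1e-12)
        pen = lam * sigma                    # effective lasso threshold
        for j in range(p):
            resid += X[:, j] * beta[j]       # remove coordinate j from fit
            rho = X[:, j] @ resid
            # soft-thresholding update for coordinate j
            beta[j] = np.sign(rho) * max(abs(rho) - pen, 0.0) / col_sq[j]
            resid -= X[:, j] * beta[j]
        if np.max(np.abs(beta - beta_old)) < tol:
            break
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, s = 100, 20, 3
    X = rng.standard_normal((n, p))
    beta_true = np.zeros(p)
    beta_true[:s] = [2.0, -1.5, 1.0]
    y = X @ beta_true + 0.5 * rng.standard_normal(n)
    # lam on the order of sqrt(n) * Phi^{-1}(1 - alpha/(2p)); the constant
    # 3.0 here is a rough stand-in for that quantile, chosen for the demo
    beta_hat = sqrt_lasso(X, y, lam=1.1 * np.sqrt(n) * 3.0)
    print(np.round(beta_hat[:s], 2))
```

Note that the penalty level lam is chosen without reference to sigma, which is the pivotal property the abstract emphasizes; the paper itself solves the problem exactly as a second-order cone program.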










