Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression


DOI: 10.1214/14-EJS875
zbMATH Open: 1281.62158
arXiv: 1306.5505
OpenAlex: W2963705952
MaRDI QID: Q389956
FDO: Q389956


Authors: Hanzhong Liu, Bin Yu


Publication date: 22 January 2014

Published in: Electronic Journal of Statistics

Abstract: We study the asymptotic properties of Lasso+mLS and Lasso+Ridge under the sparse high-dimensional linear regression model: the Lasso selects the predictors, and then modified Least Squares (mLS) or Ridge estimates their coefficients. First, we propose a valid inference procedure for parameter estimation based on a parametric residual bootstrap after Lasso+mLS and Lasso+Ridge. Second, we derive the asymptotic unbiasedness of Lasso+mLS and Lasso+Ridge. More specifically, we show that their biases decay at an exponential rate and that they achieve the oracle convergence rate of s/n (where s is the number of nonzero regression coefficients and n is the sample size) in mean squared error (MSE). Third, we show that Lasso+mLS and Lasso+Ridge are asymptotically normal. They have an oracle property in the sense that they select the true predictors with probability converging to 1, and the estimates of the nonzero parameters have the same asymptotic normal distribution that they would have if the zero parameters were known in advance. In fact, our analysis is not limited to the Lasso in the selection stage but applies to any model selection criterion for which the probability of selecting a wrong model decays at an exponential rate.


Full work available at URL: https://arxiv.org/abs/1306.5505
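To make the two-stage procedure described in the abstract concrete, below is a minimal sketch in Python with NumPy and scikit-learn. It is an illustration under simplifying assumptions rather than the paper's exact method: a plain least squares refit stands in for the paper's modified Least Squares (mLS), the Lasso penalty is chosen by cross-validation, and the ridge penalty alpha=0.1 is an arbitrary illustrative value. The bootstrap loop follows the spirit of the parametric residual bootstrap in the abstract, simulating new responses from the refitted model with Gaussian errors.

import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression, Ridge

rng = np.random.default_rng(0)

# Sparse high-dimensional setup: n samples, p predictors, s true nonzeros
n, p, s = 100, 200, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 2.0
y = X @ beta + rng.standard_normal(n)

# Stage 1: the Lasso selects a set of predictors
support = np.flatnonzero(LassoCV(cv=5, fit_intercept=False).fit(X, y).coef_)

# Stage 2: re-estimate the coefficients on the selected predictors.
# A plain least squares refit stands in for the paper's mLS;
# Ridge on the selected columns gives the Lasso+Ridge variant.
ols = LinearRegression(fit_intercept=False).fit(X[:, support], y)
ridge = Ridge(alpha=0.1, fit_intercept=False).fit(X[:, support], y)

beta_lasso_ls = np.zeros(p)
beta_lasso_ls[support] = ols.coef_
beta_lasso_ridge = np.zeros(p)
beta_lasso_ridge[support] = ridge.coef_

# Residual bootstrap after the two-stage fit: simulate responses from
# the refitted model with Gaussian errors, rerun both stages, and
# collect the coefficient draws.
fitted = X[:, support] @ ols.coef_
resid = y - fitted
sigma_hat = np.sqrt(resid @ resid / max(n - support.size, 1))

B = 200  # number of bootstrap replications (small, for illustration)
draws = np.zeros((B, p))
for b in range(B):
    y_star = fitted + rng.normal(0.0, sigma_hat, size=n)
    sel = np.flatnonzero(LassoCV(cv=5, fit_intercept=False).fit(X, y_star).coef_)
    if sel.size:
        draws[b, sel] = LinearRegression(fit_intercept=False).fit(X[:, sel], y_star).coef_

# Percentile intervals for the first s (truly nonzero) coefficients
ci = np.percentile(draws[:, :s], [2.5, 97.5], axis=0)
print(ci)

Under the paper's conditions, intervals of this kind are asymptotically valid because the selection stage picks the true predictors with probability tending to 1; the tuning choices above are purely illustrative.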



