The predictive Lasso

From MaRDI portal
Publication:693339

DOI: 10.1007/S11222-011-9279-3
zbMATH Open: 1252.62075
arXiv: 1009.2302
OpenAlex: W1964804880
MaRDI QID: Q693339


Authors: Minh-Ngoc Tran, David J. Nott, Chenlei Leng


Publication date: 7 December 2012

Published in: Statistics and Computing

Abstract: We propose a shrinkage procedure for simultaneous variable selection and estimation in generalized linear models (GLMs) with an explicit predictive motivation. The procedure estimates the coefficients by minimizing the Kullback-Leibler divergence from a set of candidate predictive distributions to the corresponding predictive distributions under the full model, subject to an l1 constraint on the coefficient vector. This yields a parsimonious model with predictive performance similar to that of the full model. Because the problem has the same form as the original lasso problem for GLMs, our procedure can exploit existing l1-regularization path algorithms. Simulation studies and real-data examples confirm the efficiency of our method in terms of predictive performance on future observations.
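The idea in the abstract can be illustrated with a minimal sketch for the Gaussian linear model, where the Kullback-Leibler divergence between Gaussian predictive densities reduces to squared error between linear predictors, so the predictive criterion amounts to running an l1-penalized fit against the full model's fitted values. All names, the data-generating setup, and the tuning value below are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch: predictive-lasso idea for a Gaussian linear model.
# Step 1 fits the full model by OLS; step 2 runs a lasso (plain
# coordinate descent) against the full-model fitted values, i.e. it
# minimizes ||y_full - X b||^2 / (2n) + lam * ||b||_1.
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]          # only 3 active coefficients
y = X @ beta_true + rng.standard_normal(n)

# Step 1: full-model (OLS) fit gives the full model's predictive mean.
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
y_full = X @ beta_full

# Step 2: lasso on the fitted values via coordinate descent with
# soft-thresholding updates.
def lasso_cd(X, y, lam, n_iter=500):
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]   # partial residual excluding j
            rho = X[:, j] @ r_j
            b[j] = np.sign(rho) * max(abs(rho) - n * lam, 0.0) / col_sq[j]
    return b

beta_pred = lasso_cd(X, y_full, lam=0.2)
print("selected:", np.nonzero(np.abs(beta_pred) > 1e-8)[0])
```

With a suitable penalty level, the sparse solution keeps the variables that matter for reproducing the full model's predictions, which is the parsimony-with-similar-predictive-performance trade-off the abstract describes.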


Full work available at URL: https://arxiv.org/abs/1009.2302










Cited In (14)





This page was built for publication: The predictive Lasso
