On the adaptive elastic net with a diverging number of parameters

From MaRDI portal
Publication:2388979

DOI: 10.1214/08-AOS625
zbMATH Open: 1168.62064
arXiv: 0908.1836
Wikidata: Q40349223 (Scholia: Q40349223)
MaRDI QID: Q2388979 (FDO: Q2388979)


Authors: Hui Zou, Hao Helen Zhang


Publication date: 22 July 2009

Published in: The Annals of Statistics

Abstract: We consider the problem of model selection and estimation in situations where the number of parameters diverges with the sample size. When the dimension is high, an ideal method should have the oracle property [J. Amer. Statist. Assoc. 96 (2001) 1348--1360] and [Ann. Statist. 32 (2004) 928--961] which ensures the optimal large sample performance. Furthermore, the high-dimensionality often induces the collinearity problem, which should be properly handled by the ideal method. Many existing variable selection methods fail to achieve both goals simultaneously. In this paper, we propose the adaptive elastic-net that combines the strengths of the quadratic regularization and the adaptively weighted lasso shrinkage. Under weak regularity conditions, we establish the oracle property of the adaptive elastic-net. We show by simulations that the adaptive elastic-net deals with the collinearity problem better than the other oracle-like methods, thus enjoying much improved finite sample performance.


Full work available at URL: https://arxiv.org/abs/0908.1836
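The two-stage estimator the abstract describes — an initial elastic-net fit used to build adaptive lasso weights, followed by a weighted elastic-net fit — can be sketched in numpy. This is a minimal illustration, not the paper's implementation: the coordinate-descent solver, the tuning values `lam1`, `lam2`, `gamma`, and the small constant guarding division by zero are all illustrative choices.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def enet_cd(X, y, lam1, lam2, w=None, n_iter=200):
    """Coordinate descent for the (weighted) elastic net:
       minimize (1/2n)||y - Xb||^2 + lam1 * sum_j w_j |b_j| + (lam2/2)||b||^2."""
    n, p = X.shape
    if w is None:
        w = np.ones(p)                       # unweighted L1 -> ordinary elastic net
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n        # (1/n)||x_j||^2 for each column
    r = y - X @ b                            # current residual
    for _ in range(n_iter):
        for j in range(p):
            r = r + X[:, j] * b[j]           # remove coordinate j from the fit
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam1 * w[j]) / (col_sq[j] + lam2)
            r = r - X[:, j] * b[j]           # add coordinate j back
    return b

def adaptive_enet(X, y, lam1, lam2, gamma=1.0):
    """Two-stage adaptive elastic-net sketch."""
    # Stage 1: ordinary elastic net to build the adaptive weights.
    b_init = enet_cd(X, y, lam1, lam2)
    # Weights w_j = |b_init_j|^(-gamma); the 1e-8 offset avoiding division
    # by zero is an implementation convenience, not from the paper.
    w = 1.0 / (np.abs(b_init) + 1e-8) ** gamma
    # Stage 2: elastic net with adaptively weighted lasso shrinkage.
    return enet_cd(X, y, lam1, lam2, w=w)

# Usage: sparse truth with a few active variables.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
beta = np.zeros(10)
beta[[0, 1, 4]] = [3.0, 1.5, 2.0]
y = X @ beta + 0.1 * rng.standard_normal(200)
b_hat = adaptive_enet(X, y, lam1=0.1, lam2=0.1)
```

Large stage-1 coefficients receive small weights (little extra shrinkage), while coefficients shrunk toward zero in stage 1 receive large weights and are driven exactly to zero in stage 2 — this reweighting is what yields the oracle property, with the ridge term `lam2` stabilizing the fit under collinearity.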









This page was built for publication: On the adaptive elastic net with a diverging number of parameters
