Component selection in the additive regression model


DOI: 10.1111/J.1467-9469.2012.00823.X
zbMATH Open: 1364.62091
arXiv: 1101.0047
OpenAlex: W1938876420
MaRDI QID: Q2852624
FDO: Q2852624


Authors: Xia Cui, Heng Peng, Songqiao Wen, Li-Xing Zhu


Publication date: 9 October 2013

Published in: Scandinavian Journal of Statistics

Abstract: Similar to variable selection in the linear regression model, selecting the significant components of the popular additive regression model is of great interest. However, such components are unknown smooth functions of the independent variables and are unobservable, so some approximation is needed. In this paper, we propose a combination of penalized regression spline approximation and group variable selection, called the lasso-type spline method (LSM), to handle this component selection problem with a diverging number of strongly correlated variables in each group. We show that the proposed method can simultaneously select the significant components and estimate the nonparametric additive component functions at an optimal convergence rate. To make the LSM computationally stable and able to adapt its estimators to the smoothness level of the component functions, weighted power spline bases and projected weighted power spline bases are proposed. Their performance is examined in simulation studies under two set-ups, with independent predictors and with correlated predictors, respectively, and appears superior to that of competing methods. The proposed method is also extended to a partially linear regression model and applied to real data, where it gives reliable results.
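
The abstract describes the general recipe: expand each additive component in a spline basis (the paper's weighted power spline bases) and apply a group penalty so that an entire component's coefficient block is either kept or zeroed out. The following is a minimal illustrative sketch of that idea, not the authors' implementation: it uses a plain truncated power spline basis per predictor and a standard group-lasso block coordinate descent. The function names, the knot count, and the tuning value lam=2.0 are all illustrative assumptions.

```python
# Sketch: component selection in an additive model via a group-lasso penalty
# on per-predictor spline coefficient blocks. Illustrative only.
import numpy as np

def power_spline_basis(x, n_knots=5, degree=3):
    """Truncated power spline basis for one predictor (no intercept column)."""
    knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
    cols = [x ** d for d in range(1, degree + 1)]
    cols += [np.clip(x - k, 0.0, None) ** degree for k in knots]
    return np.column_stack(cols)

def group_lasso_additive(X, y, lam, n_knots=5, degree=3, n_iter=200, tol=1e-8):
    """Block coordinate descent; each predictor's spline basis is one group.
    Each basis is column-centred and QR-orthonormalised so that the blockwise
    update has a closed-form group soft-threshold solution."""
    n, p = X.shape
    y = y - y.mean()
    Q = []
    for j in range(p):
        B = power_spline_basis(X[:, j], n_knots, degree)
        B = B - B.mean(axis=0)
        Qj, _ = np.linalg.qr(B)          # orthonormal columns: Qj.T @ Qj = I
        Q.append(Qj)
    coefs = [np.zeros(Qj.shape[1]) for Qj in Q]
    fit = np.zeros(n)
    for _ in range(n_iter):
        max_change = 0.0
        for j in range(p):
            partial = y - fit + Q[j] @ coefs[j]   # residual excluding group j
            s = Q[j].T @ partial                  # group score
            norm_s = np.linalg.norm(s)
            # Group soft-threshold: the whole block is zeroed unless ||s|| > lam.
            new = (1.0 - lam / norm_s) * s if norm_s > lam else np.zeros_like(s)
            fit += Q[j] @ (new - coefs[j])
            max_change = max(max_change, float(np.max(np.abs(new - coefs[j]))))
            coefs[j] = new
        if max_change < tol:
            break
    selected = [j for j in range(p) if np.linalg.norm(coefs[j]) > 0]
    return coefs, selected

# Toy example: only components 0 and 1 carry signal.
rng = np.random.default_rng(0)
n, p = 300, 8
X = rng.uniform(-1.0, 1.0, size=(n, p))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.3 * rng.standard_normal(n)
_, selected = group_lasso_additive(X, y, lam=2.0)
print("selected component indices:", selected)
```

The per-group orthonormalisation is the key convenience here: with Q_j.T @ Q_j = I, minimising 0.5 * ||r - Q_j b||^2 + lam * ||b||_2 over b has the closed form b = max(0, 1 - lam / ||Q_j.T r||) * Q_j.T r, so each coordinate-descent step is exact. The paper's LSM additionally weights and projects the power spline bases to stabilise computation and adapt to the smoothness of each component, which this sketch does not attempt.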


Full work available at URL: https://arxiv.org/abs/1101.0047





