Sparse and efficient estimation for partial spline models with increasing dimension


DOI: 10.1007/S10463-013-0440-Y
zbMATH Open: 1331.65028
arXiv: 1310.8633
OpenAlex: W2112549647
Wikidata: Q43111864 (Scholia: Q43111864)
MaRDI QID: Q2255168
FDO: Q2255168


Authors: Guang Cheng, Hao Helen Zhang, Zuofeng Shang


Publication date: 6 February 2015

Published in: Annals of the Institute of Statistical Mathematics

Abstract: We consider model selection and estimation for partial spline models and propose a new regularization method in the context of smoothing splines. The regularization method has a simple yet elegant form, consisting of a roughness penalty on the nonparametric component and a shrinkage penalty on the parametric components, which achieves function smoothing and sparse estimation simultaneously. We establish the convergence rate and oracle properties of the estimator under weak regularity conditions. Remarkably, the estimated parametric components are sparse and efficient, and the nonparametric component can be estimated at the optimal rate. The procedure also has attractive computational properties. Using the representer theorem for smoothing splines, we reformulate the objective function as a LASSO-type problem, which enables us to use the LARS algorithm to compute the solution path. We then extend the procedure to situations in which the number of predictors increases with the sample size and investigate its asymptotic properties in that context. Finite-sample performance is illustrated by simulations.
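
For orientation, a penalized objective of the kind described in the abstract typically takes the following form in the standard partially linear smoothing spline setup; the notation (y_i, x_i, t_i) and the tuning parameters lambda_1, lambda_2 are assumptions made here for illustration and need not match the paper's exact formulation.

% Sketch (assumed notation): squared-error loss plus a roughness penalty
% on the nonparametric component f and an l1 shrinkage penalty on beta.
\[
\min_{\beta \in \mathbb{R}^p,\; f}\;
  \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - x_i^\top\beta - f(t_i)\bigr)^2
  \;+\; \lambda_1 \int \bigl(f''(t)\bigr)^2\,dt
  \;+\; \lambda_2 \sum_{j=1}^{p} |\beta_j|.
\]

By the representer theorem, the minimizing f lies in a finite-dimensional space spanned by spline basis functions at the observed t_i, which is what allows the objective to be rewritten as a LASSO-type problem and solved with the LARS algorithm, as stated in the abstract.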


Full work available at URL: https://arxiv.org/abs/1310.8633
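
The LASSO/LARS reformulation mentioned in the abstract can be illustrated with a small, self-contained sketch. This is not the paper's algorithm: it swaps the smoothing-spline smoother for a simple kernel smoother, uses scikit-learn's LassoLars in place of a dedicated solution-path implementation, and all variable names, simulated data, and tuning values below are illustrative assumptions.

# Minimal sketch (assumptions noted above): profile out the nonparametric
# component with a linear smoother, then solve the resulting LASSO-type
# problem for the parametric coefficients with LARS.
import numpy as np
from sklearn.linear_model import LassoLars

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))            # parametric covariates
t = np.sort(rng.uniform(0.0, 1.0, n))      # nonparametric covariate
beta_true = np.concatenate([[2.0, -1.5, 1.0], np.zeros(p - 3)])
y = X @ beta_true + np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal(n)

# Stand-in linear smoother S (Nadaraya-Watson with a Gaussian kernel);
# the paper's smoother comes from the smoothing-spline representer theorem.
h = 0.05
K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
S = K / K.sum(axis=1, keepdims=True)

# Partial out the nonparametric component: with f profiled out, the
# problem in beta is an ordinary LASSO on transformed data, which
# LARS can solve along its whole regularization path.
I_n = np.eye(n)
y_tilde = (I_n - S) @ y
X_tilde = (I_n - S) @ X

fit = LassoLars(alpha=0.02)   # alpha plays the role of the shrinkage penalty
fit.fit(X_tilde, y_tilde)
print("estimated parametric coefficients:", np.round(fit.coef_, 2))

# Recover the nonparametric component by smoothing the parametric residuals.
f_hat = S @ (y - X @ fit.coef_)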





