Sparse and efficient estimation for partial spline models with increasing dimension
From MaRDI portal
Publication:2255168
Abstract: We consider model selection and estimation for partial spline models and propose a new regularization method in the context of smoothing splines. The regularization method has a simple yet elegant form, consisting of a roughness penalty on the nonparametric component and a shrinkage penalty on the parametric components, which achieves function smoothing and sparse estimation simultaneously. We establish the convergence rate and oracle properties of the estimator under weak regularity conditions. Remarkably, the estimated parametric components are sparse and efficient, and the nonparametric component can be estimated at the optimal rate. The procedure also has attractive computational properties. Using the representer theorem for smoothing splines, we reformulate the objective function as a LASSO-type problem, which allows the LARS algorithm to compute the solution path. We then extend the procedure to the setting in which the number of predictors increases with the sample size and investigate its asymptotic properties in that context. Finite-sample performance is illustrated by simulations.
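The reformulation described in the abstract can be illustrated with a minimal numerical sketch: profile out the nonparametric component with a smoother matrix, so that the parametric part reduces to an ordinary LASSO problem in transformed data. The sketch below is an assumption-laden stand-in, not the authors' exact procedure: it substitutes a Gaussian-kernel smoother for the smoothing-spline hat matrix and a simple coordinate-descent LASSO solver for the LARS path algorithm, and all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 8

# Simulated partial spline model: y = X @ beta + f(t) + noise,
# with a sparse beta and a smooth nonparametric component f.
t = np.sort(rng.uniform(0.0, 1.0, n))
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0])
y = X @ beta_true + np.sin(2 * np.pi * t) + 0.3 * rng.normal(size=n)

# Gaussian-kernel smoother matrix on t -- a stand-in (assumption) for
# the smoothing-spline hat matrix implied by the representer theorem.
h = 0.05
K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
S = K / K.sum(axis=1, keepdims=True)

# Profile out the nonparametric component: the parametric part then
# becomes a LASSO-type problem in the transformed responses/design.
I = np.eye(n)
y_t = (I - S) @ y
X_t = (I - S) @ X

def lasso_cd(A, b, lam, n_iter=500):
    """Cyclic coordinate descent for (1/2)||b - A beta||^2 + lam * ||beta||_1.
    (A simple substitute for the LARS path algorithm used in the paper.)"""
    _, q = A.shape
    beta = np.zeros(q)
    col_sq = (A ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(q):
            # Partial residual excluding coordinate j, then soft-threshold.
            r = b - A @ beta + A[:, j] * beta[j]
            z = A[:, j] @ r
            beta[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_sq[j]
    return beta

beta_hat = lasso_cd(X_t, y_t, lam=10.0)

# Plug the parametric fit back in to recover the smooth component.
f_hat = S @ (y - X @ beta_hat)
print(np.round(beta_hat, 2))
```

With a suitable penalty level, the transformed LASSO recovers the sparsity pattern (zero coefficients are shrunk to zero) while the kernel smoother absorbs the smooth trend, mirroring the simultaneous smoothing-and-selection behavior the abstract describes.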
Recommendations
- scientific article; zbMATH DE number 6951486
- Estimation and variable selection for generalized additive partial linear models
- Spline estimator for ultra-high dimensional partially linear varying coefficient models
- Automatic model selection for partially linear models
- Variable selection for the partial linear single-index model
Cites work
- scientific article; zbMATH DE number 4011660
- scientific article; zbMATH DE number 4098524
- scientific article; zbMATH DE number 45848
- scientific article; zbMATH DE number 700016
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 3273551
- Adaptive Lasso for Cox's proportional hazards model
- Adaptive Lasso for sparse high-dimensional regression models
- An elementary estimator of the partial linear model
- Asymptotic behavior of M-estimators of p regression parameters when \(p^2/n\) is large. I. Consistency
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- Automatic model selection for partially linear models
- Better Subset Regression Using the Nonnegative Garrote
- Convergence rates for partially splined models
- Least angle regression. (With discussion)
- Local and global asymptotic inference in smoothing spline models
- Mathematical Statistics
- New Estimation and Model Selection Procedures for Semiparametric Modeling in Longitudinal Data Analysis
- Nonconcave penalized likelihood with a diverging number of parameters.
- On the adaptive elastic net with a diverging number of parameters
- Penalized quasi-likelihood estimation in partial linear models
- Semiparametric Regression
- Shrinkage tuning parameter selection with a diverging number of parameters
- Simultaneous analysis of Lasso and Dantzig selector
- Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation
- Smoothing spline ANOVA models
- Some results on Tchebycheffian spline functions and stochastic processes
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- The Adaptive Lasso and Its Oracle Properties
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Weak convergence and empirical processes. With applications to statistics
Cited in (5)
- More efficient approximation of smoothing splines via space-filling basis selection
- Semiparametric efficient estimation in high-dimensional partial linear regression models
- A partially linear framework for massive heterogeneous data
- An RKHS-based approach to double-penalized regression in high-dimensional partially linear models
- Minimax optimal estimation in partially linear additive models under high dimension