Rank reduction for high-dimensional generalized additive models
From MaRDI portal
Publication: 2274971
DOI: 10.1016/j.jmva.2019.05.005 · zbMath: 1422.62130 · OpenAlex: W2948527995 · MaRDI QID: Q2274971
Hua Liang, Heng Lian, Hongmei Lin
Publication date: 1 October 2019
Published in: Journal of Multivariate Analysis
Full work available at URL: https://doi.org/10.1016/j.jmva.2019.05.005
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- Incremental proximal methods for large scale convex optimization
- Variable selection in nonparametric additive models
- High-dimensional additive modeling
- Local asymptotics for polynomial spline regression
- Local asymptotics for regression splines and confidence regions
- Estimation and model selection in generalized additive partial linear models for correlated data with diverging number of covariates
- Extended BIC for small-n-large-P sparse GLM
- Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Additive Models
- Extended Bayesian information criteria for model selection with large model spaces
- Sliced Inverse Regression for Dimension Reduction
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sparse Additive Models
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Varying Coefficient Models
- Generalized additive partial linear models with high-dimensional covariates
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Sufficient Dimension Reduction via Inverse Regression
- A practical guide to splines
- Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors