Semiparametric Bayesian information criterion for model selection in ultra-high dimensional additive models
From MaRDI portal
Publication: Q391941
DOI: 10.1016/J.JMVA.2013.09.015
zbMATH Open: 1278.62054
arXiv: 1107.4861
OpenAlex: W2074929850
MaRDI QID: Q391941
Authors: Heng Lian
Publication date: 13 January 2014
Published in: Journal of Multivariate Analysis
Abstract: For linear models with a diverging number of parameters, it has recently been shown that modified versions of the Bayesian information criterion (BIC) can identify the true model consistently. In many applications, however, there is little justification that the covariate effects are actually linear, and a semiparametric model such as the additive model studied here is a viable alternative. We demonstrate that the consistency results for BIC-type criteria extend to this more challenging setting, with the dimension diverging exponentially fast in the sample size. In addition, our theoretical study relaxes the noise assumptions, which significantly enlarges the applicability of the criterion to a more general class of models.
Full work available at URL: https://arxiv.org/abs/1107.4861
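To illustrate the kind of criterion the abstract describes, the sketch below fits additive models by expanding each candidate covariate in a spline basis and scores subsets with an EBIC-style penalty of the form n·log(RSS/n) + df·log(n) + 2γ·log C(p, |S|). This is a minimal, self-contained illustration of a BIC-type criterion for additive models, not the paper's exact criterion; the basis construction, penalty constants, and simulated data are assumptions made for the example.

```python
import numpy as np
from itertools import combinations
from math import comb, log

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.uniform(size=(n, p))
# Simulated data: only the first two covariates have (nonlinear) effects.
y = (np.sin(2 * np.pi * X[:, 0])
     + 4.0 * (X[:, 1] - 0.5) ** 2
     + rng.normal(scale=0.3, size=n))

def basis(x, d=5):
    """Truncated-power quadratic spline basis for one covariate (illustrative choice)."""
    knots = np.quantile(x, np.linspace(0, 1, d)[1:-1])
    cols = [x, x ** 2] + [np.maximum(x - k, 0.0) ** 2 for k in knots]
    return np.column_stack(cols)

def ebic(subset, gamma=1.0):
    """EBIC-style score: n*log(RSS/n) + df*log(n) + 2*gamma*log C(p, |S|)."""
    if subset:
        B = np.column_stack([np.ones(n)] + [basis(X[:, j]) for j in subset])
    else:
        B = np.ones((n, 1))
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    rss = float(np.sum((y - B @ coef) ** 2))
    df = B.shape[1]
    return n * log(rss / n) + df * log(n) + 2 * gamma * log(comb(p, len(subset)))

# Exhaustive search over small subsets for illustration only; in practice the
# criterion is paired with a screening or penalization path, not enumeration.
candidates = [s for k in range(3) for s in combinations(range(p), k)]
best = min(candidates, key=ebic)
print(best)  # on this simulated data the true support (0, 1) is recovered
```

The log C(p, |S|) term is what distinguishes extended BIC from classical BIC: it penalizes the size of the model space itself, which is what allows consistency when p grows much faster than n.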
Recommendations
- High-dimensional Bayesian inference in nonparametric additive models
- Extended Bayesian information criteria for model selection with large model spaces
- Semiparametric model selection in large samples
- Extended BIC for linear regression models with diverging number of relevant features and high or ultra-high feature spaces
- Model Selection via Bayesian Information Criterion for Quantile Regression Models
Mathematics Subject Classification:
- Bayesian inference (62F15)
- Nonparametric regression and quantile regression (62G08)
- Asymptotic properties of nonparametric inference (62G20)
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- Extended Bayesian information criteria for model selection with large model spaces
- Title not available
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- Some Comments on C_p
- Penalized Spline Estimation for Partially Linear Single-Index Models
- Model Selection and Estimation in Regression with Grouped Variables
- A practical guide to splines
- Additive regression and other nonparametric models
- Shrinkage tuning parameter selection with a diverging number of parameters
- Title not available
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Nonparametric independence screening in sparse ultra-high-dimensional additive models
- Shrinkage estimation of the varying coefficient model
- Variable selection in nonparametric additive models
- On Bayes procedures
- Statistical predictor identification
- Estimation and variable selection for generalized additive partial linear models
- Title not available
Cited In (5)
- Extended Bayesian information criteria for model selection with large model spaces
- Variable selection for additive model via cumulative ratios of empirical strengths total
- High-dimensional Bayesian inference in nonparametric additive models
- Composite likelihood Bayesian information criteria for model selection in high-dimensional data
- Semiparametric partial linear modeling of risk factors for ear infections: the Early Childhood Longitudinal Study