Variable Selection Using a Smooth Information Criterion for Distributional Regression Models
From MaRDI portal
Abstract: Modern variable selection procedures use penalization to perform simultaneous model selection and estimation. A popular method is the LASSO (least absolute shrinkage and selection operator), whose use requires selecting the value of a tuning parameter. This parameter is typically tuned by minimizing the cross-validation error or the Bayesian information criterion (BIC), but this can be computationally intensive as it involves fitting an array of candidate models and selecting the best one. In contrast to this standard approach, we have developed a procedure based on the so-called "smooth IC" (SIC), in which the tuning parameter is selected automatically in one step. We also extend this model selection procedure to the distributional regression framework, which is more flexible than classical regression modelling. Distributional regression, also known as multiparameter regression (MPR), introduces flexibility by accounting for the effect of covariates through multiple distributional parameters simultaneously, e.g., the mean and the variance. These models are useful in the context of normal linear regression when the process under study exhibits heteroscedastic behaviour. Reformulating the distributional regression estimation problem in terms of penalized likelihood enables us to take advantage of the close relationship between model selection criteria and penalization. Utilizing the SIC is computationally advantageous, as it obviates the need to choose multiple tuning parameters.
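The idea in the abstract can be illustrated with a minimal sketch: fit a heteroscedastic normal (mean-and-variance) regression by minimizing a penalized negative log-likelihood in which a smooth approximation to the number of nonzero coefficients, weighted by a BIC-type factor log(n)/2, acts as an information criterion minimized in a single optimization. The smoothing function theta^2/(theta^2 + eps^2), the choice of eps, and all variable names below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated heteroscedastic data: both the mean and the log-variance
# depend on covariates, as in distributional / multiparameter regression.
rng = np.random.default_rng(1)
n, p = 500, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
beta_true = np.array([1.0, 2.0, 0.0, 0.0, 1.5])    # mean coefficients
alpha_true = np.array([0.5, 0.8, 0.0, 0.0, 0.0])   # log-variance coefficients
y = X @ beta_true + rng.normal(size=n) * np.exp(0.5 * X @ alpha_true)

eps = 0.05              # smoothing constant (illustrative choice)
lam = np.log(n) / 2.0   # BIC-type penalty weight

def phi(theta):
    # Smooth approximation to the l0 "norm": ~1 when |theta| >> eps, 0 at 0,
    # so the penalty approximately counts the active coefficients.
    return theta**2 / (theta**2 + eps**2)

def objective(par):
    beta, alpha = par[:p + 1], par[p + 1:]
    mu, log_var = X @ beta, X @ alpha
    nll = 0.5 * np.sum(log_var + (y - mu) ** 2 / np.exp(log_var))
    # Intercepts of both distributional parameters are left unpenalized.
    pen = lam * (np.sum(phi(beta[1:])) + np.sum(phi(alpha[1:])))
    return nll + pen

# Warm start: OLS for the mean, pooled residual variance for the scale intercept.
beta0 = np.linalg.lstsq(X, y, rcond=None)[0]
alpha0 = np.zeros(p + 1)
alpha0[0] = np.log(np.var(y - X @ beta0))
fit = minimize(objective, np.concatenate([beta0, alpha0]), method="BFGS")
beta_hat, alpha_hat = fit.x[:p + 1], fit.x[p + 1:]
```

Because the penalty gradient vanishes for coefficients well away from zero, large effects are retained essentially unshrunk, while near-zero coefficients in both the mean and variance models are driven towards zero — all in one fit, with no grid of tuning-parameter values to search over.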
Recommendations
- Tuning parameter selector for the penalized likelihood method in multivariate generalized linear models
- Consistent selection of tuning parameters via variable selection stability
- Model selection with data-oriented penalty
- An Adaptive Method of Variable Selection in Regression
- Adaptive Model Selection
Cites work
- scientific article; zbMATH DE number 3841083
- scientific article; zbMATH DE number 1034037
- scientific article; zbMATH DE number 3998953
- scientific article; zbMATH DE number 845714
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- A globally convergent algorithm for Lasso-penalized mixture of linear regression models
- A new look at the statistical model identification
- A study of error variance estimation in Lasso regression
- A uniform framework for the combination of penalties in generalized structured models
- Adaptive robust variable selection
- BAMLSS: Bayesian Additive Models for Location, Scale, and Shape (and Beyond)
- Can the strengths of AIC and BIC be shared? A conflict between model identification and regression estimation
- Checking for Lack of Fit in Linear Models with Parametric Variance Functions
- Estimating Regression Models with Multiplicative Heteroscedasticity
- Estimating the dimension of a model
- GAMLSS: A distributional regression approach
- Generalized Additive Models for Location, Scale and Shape
- Hedonic housing prices and the demand for clean air
- Lasso-type penalization in the framework of generalized additive models for location, scale and shape
- Least angle regression. (With discussion)
- Multi-parameter regression survival modeling: an alternative to proportional hazards
- Pathwise coordinate optimization
- Regularization and Variable Selection Via the Elastic Net
- Robust statistics. Theory and methods (with R)
- Shrinkage tuning parameter selection with a diverging number of parameters
- Sparse Estimation of Generalized Linear Models (GLM) via Approximated Information Criteria
- Sparse regression with multi-type regularized feature modeling
- The Adaptive Lasso and Its Oracle Properties
- Unified LASSO Estimation by Least Squares Approximation
- Variable Selection Using a Smooth Information Criterion for Distributional Regression Models
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Variable selection for Cox's proportional hazards model and frailty model
- Variable selection using MM algorithms
Cited in (2)
This page was built for publication: Variable Selection Using a Smooth Information Criterion for Distributional Regression Models
MaRDI item Q85096