The predictive Lasso
From MaRDI portal
Publication:693339
Abstract: We propose a shrinkage procedure for simultaneous variable selection and estimation in generalized linear models (GLMs) with an explicit predictive motivation. The procedure estimates the coefficients by minimizing the Kullback-Leibler divergence of a set of predictive distributions to the corresponding predictive distributions for the full model, subject to an \(L_1\) constraint on the coefficient vector. This results in selection of a parsimonious model with similar predictive performance to the full model. Because it takes a form similar to the original lasso problem for GLMs, our procedure can benefit from available \(L_1\)-regularization path algorithms. Simulation studies and real-data examples confirm the efficiency of our method in terms of predictive performance on future observations.
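The idea admits a simple sketch in the Gaussian linear case: there, the Kullback-Leibler divergence between two normal predictive distributions with common variance reduces to a squared distance of predictive means, so the predictive lasso amounts to an ordinary lasso fit to the full model's fitted values instead of the raw responses. Below is a minimal illustration using scikit-learn; the simulated data and the penalty level are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta_true = np.array([3.0, 1.5, 0, 0, 2.0, 0, 0, 0, 0, 0])
y = X @ beta_true + rng.normal(size=n)

# Step 1: fit the full model (here ordinary least squares) and record
# its predictive means at the design points.
full = LinearRegression().fit(X, y)
y_full = full.predict(X)

# Step 2: minimize the summed KL divergence to the full model's
# predictive distributions subject to an L1 constraint. For Gaussian
# predictions with common variance this is a lasso regression of the
# full model's fitted values on X.
plasso = Lasso(alpha=0.1).fit(X, y_full)
print(plasso.coef_)  # sparse coefficients with predictive performance close to the full model
```

For non-Gaussian GLMs the KL objective no longer collapses to a squared-error loss, but it retains the lasso-like structure the abstract mentions, so standard \(L_1\)-regularization path solvers still apply.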
Recommendations
Cites work
- scientific article; zbMATH DE number 5957408
- scientific article; zbMATH DE number 735230
- scientific article; zbMATH DE number 1932867
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 3249515
- A criterion for optimal predictive model selection
- A weakly informative default prior distribution for logistic and other regression models
- Bayes Model Averaging with Selection of Regressors
- Bayesian Model Assessment and Comparison Using Cross-Validation Predictive Densities
- Bayesian Model Averaging for Linear Regression Models
- Bayesian model averaging: A tutorial. (with comments and a rejoinder).
- Bayesian projection approaches to variable selection in generalized linear models
- Goodness of prediction fit
- Model Selection and Multimodel Inference
- Regularization and Variable Selection Via the Elastic Net
- Strictly Proper Scoring Rules, Prediction, and Estimation
- The Adaptive Lasso and Its Oracle Properties
- The Bayesian Lasso
- Variable selection in qualitative models via an entropic explanatory power
Cited in (17)
- Strong Rules for Discarding Predictors in Lasso-Type Problems
- Projective inference in high-dimensional problems: prediction and feature selection
- Comparison of Bayesian predictive methods for model selection
- Post model-fitting exploration via a ``next-door analysis''
- Variable selection in generalized linear models.
- Fast, Optimal, and Targeted Predictions Using Parameterized Decision Analysis
- A fully Bayesian sparse polynomial chaos expansion approach with joint priors on the coefficients and global selection of terms
- Improving Lasso for model selection and prediction
- Lasso-type and Heuristic Strategies in Model Selection and Forecasting
- scientific article; zbMATH DE number 5957506
- A survey of Bayesian predictive methods for model assessment, selection and comparison
- Prediction weighted maximum frequency selection
- Variable selection using Kullback-Leibler divergence loss
- Improved variable selection with forward-lasso adaptive shrinkage
- Estimation, prediction and inference for the LASSO random effects model
- IPF-LASSO: integrative \(L_1\)-penalized regression with penalty factors for prediction based on multi-omics data
- Prediction Error Property of the Lasso Estimator and its Generalization
This page was built for publication: The predictive Lasso