Sparse regression with multi-type regularized feature modeling
From MaRDI portal
Publication:2657005
Abstract: Within the statistical and machine learning literature, regularization techniques are often used to construct sparse (predictive) models. Most regularization strategies only work for data where all predictors are treated identically, such as Lasso regression for (continuous) predictors modeled as linear effects. However, many predictive problems involve different types of predictors and require a tailored regularization term. We propose a multi-type Lasso penalty that acts on the objective function as a sum of subpenalties, one for each type of predictor. As such, we allow for predictor selection and level fusion within a predictor in a data-driven way, simultaneously with the parameter estimation process. We develop a new estimation strategy for convex predictive models with this multi-type penalty. Using the theory of proximal operators, our estimation procedure is computationally efficient, partitioning the overall optimization problem into easier-to-solve subproblems, specific to each predictor type and its associated penalty. Earlier research applies approximations to non-differentiable penalties to solve the optimization problem. The proposed SMuRF algorithm removes the need for approximations and achieves higher accuracy and computational efficiency. This is demonstrated with an extensive simulation study and the analysis of a case study on insurance pricing analytics.
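The key computational idea in the abstract is that a penalty written as a sum of subpenalties, one per predictor type, has a proximal operator that splits into one independent subproblem per block, so a proximal gradient scheme can handle each predictor type with its own closed-form update. The following is a minimal illustrative sketch of that blockwise proximal gradient idea, not the authors' SMuRF implementation; the function names, the two example proximal operators (ordinary Lasso and group Lasso shrinkage), and the least-squares loss are simplifying assumptions for illustration.

```python
import numpy as np

def prox_lasso(beta, t):
    # Soft-thresholding: proximal operator of t * ||beta||_1 (Lasso subpenalty).
    return np.sign(beta) * np.maximum(np.abs(beta) - t, 0.0)

def prox_group_lasso(beta, t):
    # Blockwise shrinkage: proximal operator of t * ||beta||_2 (group Lasso subpenalty).
    norm = np.linalg.norm(beta)
    return np.zeros_like(beta) if norm <= t else (1.0 - t / norm) * beta

def multi_type_prox_gradient(X, y, blocks, lambdas, n_iter=500):
    """Proximal gradient for 0.5*||y - X b||^2 + sum_j lambda_j * P_j(b_j).

    `blocks` is a list of (slice, prox_fn) pairs, one per predictor type;
    the slices are assumed to partition the coefficient vector.
    (Hypothetical illustration, not the SMuRF algorithm itself.)
    """
    n, p = X.shape
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)   # gradient of the smooth least-squares loss
        z = beta - step * grad        # plain gradient step
        # The multi-type penalty is separable across predictor types, so its
        # prox decomposes into one easy subproblem per block.
        for (sl, prox), lam in zip(blocks, lambdas):
            beta[sl] = prox(z[sl], step * lam)
    return beta
```

The blockwise loop is the point: swapping in a different proximal operator per slice (e.g. a fused Lasso prox for an ordinal predictor) changes only that block's update, which is what makes a tailored per-predictor-type penalty tractable.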
Recommendations
- Adaptive regularization using the entire solution surface
- Smoothing proximal gradient method for general structured sparse regression
- Sparsity and Smoothness Via the Fused Lasso
- Variable selection in multivariate linear models for functional data via sparse regularization
- A sparse regularization approach with Log type penalty
Cites work
- scientific article; zbMATH DE number 3850830
- scientific article; zbMATH DE number 3574917
- scientific article; zbMATH DE number 2107836
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 6438182
- scientific article; zbMATH DE number 6734253
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A data driven binning strategy for the construction of insurance tariff classes
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- A new look at the statistical model identification
- A note on adaptive group Lasso
- A uniform framework for the combination of penalties in generalized structured models
- Coherent dispersion criteria for optimal experimental design
- Estimating the dimension of a model
- Flexible smoothing with \(B\)-splines and penalties. With comments and a rejoinder by the authors
- Generalized additive models
- Least angle regression. (With discussion)
- Model Selection and Estimation in Regression with Grouped Variables
- Non-life rate-making with Bayesian GAMs
- Nonlife ratemaking and risk management with Bayesian generalized additive models for location, scale, and shape
- On the robustness of the generalized fused Lasso to prior specifications
- Properties and refinements of the fused Lasso
- Regularization and Variable Selection Via the Elastic Net
- Relaxed Lasso
- Restricted Estimation of Generalized Linear Models
- Simultaneous Factor Selection and Collapsing Levels in ANOVA
- Sparse modeling of categorial explanatory variables
- Sparsity and Smoothness Via the Fused Lasso
- The Adaptive Lasso and Its Oracle Properties
- The solution path of the generalized lasso
Cited in (11)
- A non-convex regularization approach for stable estimation of loss development factors
- A group regularisation approach for constructing generalised age-period-cohort mortality projection models
- Identifying the determinants of lapse rates in life insurance: an automated Lasso approach
- REC: fast sparse regression-based multicategory classification
- Generalized fused Lasso for grouped data in generalized linear models
- Variable Selection Using a Smooth Information Criterion for Distributional Regression Models
- Multi-state modelling of customer churn
- Loss amount prediction from textual data using a double GLM with shrinkage and selection
- Mixture Composite Regression Models with Multi-type Feature Selection
- Insurance pricing with hierarchically structured data: an illustration with a workers' compensation insurance portfolio
- Efficient path algorithms for clustered Lasso and OSCAR