Sparse Laplacian shrinkage with the graphical Lasso estimator for regression problems
From MaRDI portal
Publication:2125484
Abstract: This paper considers a high-dimensional linear regression problem in which the predictors exhibit complex correlation structures. We propose a graph-constrained regularization procedure, named Sparse Laplacian Shrinkage with the Graphical Lasso Estimator (SLS-GLE). The procedure uses the estimated precision matrix to capture the conditional dependence pattern among predictors, and encourages sparsity in both the regression model and the graphical model. We introduce a Laplacian quadratic penalty that incorporates the graph information, and discuss in detail the advantages of using the precision matrix to construct the Laplacian matrix. Theoretical properties and numerical comparisons show that the proposed method improves both model interpretability and estimation accuracy. We also apply the method to a financial problem and show that the proposed procedure is successful in asset selection.
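The pipeline described in the abstract can be illustrated with a short sketch: estimate a precision matrix with the graphical lasso, build a Laplacian-type matrix from it, and then fit a regression with an \(\ell_1\) penalty plus a Laplacian quadratic penalty. This is an illustrative reconstruction, not the authors' implementation: the edge-weighting choice (absolute off-diagonal precision entries), the simulated data, and all tuning parameters are assumptions, and the quadratic penalty is handled via the standard data-augmentation trick (scikit-learn's `Lasso` scales its loss by \(1/(2n)\), so the penalty weights here are not on exactly the paper's scale).

```python
import numpy as np
from sklearn.covariance import GraphicalLasso
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 10

# Simulated predictors with a pair of correlated columns (assumption).
X = rng.standard_normal((n, p))
X[:, 1] = 0.8 * X[:, 0] + 0.6 * rng.standard_normal(n)
beta_true = np.zeros(p)
beta_true[:2] = 1.0
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Step 1: estimate the precision matrix Theta with the graphical lasso.
Theta = GraphicalLasso(alpha=0.1).fit(X).precision_

# Step 2: build a Laplacian from Theta; off-diagonal entries of Theta
# encode conditional dependence.  Using their absolute values as edge
# weights is one simple (assumed) choice.
W = np.abs(Theta.copy())
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W  # graph Laplacian: rows sum to zero

# Step 3: solve  min ||y - Xb||^2 + lam2 * b'Lb + lam1 * ||b||_1
# via data augmentation: a lasso on (X; sqrt(lam2) L^{1/2}), (y; 0).
lam1, lam2 = 0.05, 1.0
evals, evecs = np.linalg.eigh(L)
L_half = evecs @ np.diag(np.sqrt(np.clip(evals, 0.0, None))) @ evecs.T
X_aug = np.vstack([X, np.sqrt(lam2) * L_half])
y_aug = np.concatenate([y, np.zeros(p)])
beta_hat = Lasso(alpha=lam1, fit_intercept=False).fit(X_aug, y_aug).coef_
```

The augmentation step mirrors the elastic-net/Laplacian-smoothing device used in the graph-constrained regression literature the paper builds on: the quadratic term \(\beta^\top L \beta\) becomes an ordinary least-squares residual on the appended rows, so any lasso solver can handle the combined penalty.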
Recommendations
- The sparse Laplacian shrinkage estimator for high-dimensional regression
- Sparse inverse covariance estimation with the graphical lasso
- Bayesian regularization via graph Laplacian
- High-dimensional graphs and variable selection with the Lasso
- Bayesian regularization for graphical models with unequal shrinkage
Cites work
- scientific article; zbMATH DE number 5957408 (no title available)
- scientific article; zbMATH DE number 845714 (no title available)
- scientific article; zbMATH DE number 964896 (no title available)
- A General Framework for Weighted Gene Co-Expression Network Analysis
- Adaptive and reversed penalty for analysis of high-dimensional correlated data
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Lasso-type recovery of sparse representations for high-dimensional data
- Model Selection and Estimation in Regression with Grouped Variables
- Model selection and estimation in the Gaussian graphical model
- Nearly unbiased variable selection under minimax concave penalty
- Nonconcave penalized likelihood with a diverging number of parameters.
- Nonnegative-Lasso and application in index tracking
- Pathwise coordinate optimization
- Regularization and Variable Selection Via the Elastic Net
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using \(\ell _{1}\)-Constrained Quadratic Programming (Lasso)
- Shrinkage and model selection with correlated variables via weighted fusion
- Simultaneous analysis of Lasso and Dantzig selector
- Sparse inverse covariance estimation with the graphical lasso
- Sparse regression with exact clustering
- SparseNet: coordinate descent with nonconvex penalties
- Sparsity and Smoothness Via the Fused Lasso
- The Mnet method for variable selection
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- The sparse Laplacian shrinkage estimator for high-dimensional regression
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Variable selection and regression analysis for graph-structured covariates with an application to genomics
Cited in (8 documents)
- Graph-based regularization for regression problems with alignment and highly correlated designs
- A significance test for graph-constrained estimation
- Doubly sparse regression incorporating graphical structure among predictors
- Coordinate descent algorithm for covariance graphical Lasso
- Graphical-model based high dimensional generalized linear models
- Multivariate sparse Laplacian shrinkage for joint estimation of two graphical structures
- Graphical lassos for meta‐elliptical distributions
- The sparse Laplacian shrinkage estimator for high-dimensional regression