Sparse Laplacian shrinkage with the graphical Lasso estimator for regression problems
Publication: 2125484
DOI: 10.1007/s11749-021-00779-7
zbMATH Open: 1484.62096
arXiv: 1904.04664
OpenAlex: W3172378359
MaRDI QID: Q2125484
FDO: Q2125484
Authors: Yanyan Li
Publication date: 14 April 2022
Published in: Test
Abstract: This paper considers a high-dimensional linear regression problem in which there are complex correlation structures among the predictors. We propose a graph-constrained regularization procedure, named the Sparse Laplacian Shrinkage with the Graphical Lasso Estimator (SLS-GLE). The procedure uses the estimated precision matrix to describe the conditional dependence pattern among the predictors, and encourages sparsity in both the regression model and the graphical model. We introduce a Laplacian quadratic penalty that incorporates the graph information, and discuss in detail the advantages of using the precision matrix to construct the Laplacian matrix. Theoretical properties and numerical comparisons show that the proposed method improves both model interpretability and estimation accuracy. We also apply the method to a financial problem and show that the proposed procedure is successful in asset selection.
Full work available at URL: https://arxiv.org/abs/1904.04664
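The sketch below illustrates the two-stage idea described in the abstract; it is not the authors' implementation, and all function and parameter names (sls_gle_sketch, alpha_glasso, lam1, lam2) are illustrative. It estimates the precision matrix with scikit-learn's GraphicalLasso, turns the off-diagonal partial correlations into a graph Laplacian L = D - A, and solves (1/2n)||y - Xb||^2 + lam2*b'Lb + lam1*||b||_1. Note that the paper's SLS component uses a nonconvex (MCP-type) sparsity penalty; the l1 penalty here is a convex stand-in.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso
from sklearn.linear_model import Lasso

def sls_gle_sketch(X, y, alpha_glasso=0.1, lam1=0.1, lam2=0.1):
    """Illustrative two-stage sketch of the SLS-GLE idea, assuming
    standardized predictors and using an l1 penalty in place of the
    paper's nonconvex (MCP-type) sparsity penalty."""
    n, p = X.shape

    # Stage 1: sparse precision matrix Theta from the graphical lasso.
    theta = GraphicalLasso(alpha=alpha_glasso).fit(X).precision_

    # Stage 2a: weighted adjacency from the |partial correlations|
    # implied by Theta, then the graph Laplacian L = D - A (PSD).
    d = np.sqrt(np.diag(theta))
    A = np.abs(theta) / np.outer(d, d)
    np.fill_diagonal(A, 0.0)
    Lap = np.diag(A.sum(axis=1)) - A

    # Stage 2b: fold the Laplacian quadratic penalty lam2*b'Lb into an
    # augmented least-squares problem (elastic-net-style augmentation):
    # append sqrt(2n*lam2)*L^{1/2} to X and p zeros to y, then run Lasso.
    w, V = np.linalg.eigh(Lap)
    sqrtL = (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T
    X_aug = np.vstack([X, np.sqrt(2.0 * n * lam2) * sqrtL])
    y_aug = np.concatenate([y, np.zeros(p)])

    # sklearn's Lasso averages the squared loss over the m = n + p
    # augmented rows, so rescale lam1 by n/m to keep the weighting.
    m = n + p
    fit = Lasso(alpha=lam1 * n / m, fit_intercept=False).fit(X_aug, y_aug)
    return fit.coef_, theta
```

The augmentation step mirrors the classical elastic-net trick: because the Laplacian is positive semidefinite, its matrix square root lets the quadratic penalty be absorbed into an enlarged design matrix, so any standard Lasso solver handles the combined objective.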
Recommendations
- The sparse Laplacian shrinkage estimator for high-dimensional regression
- Sparse inverse covariance estimation with the graphical lasso
- Bayesian regularization via graph Laplacian
- High-dimensional graphs and variable selection with the Lasso
- Bayesian regularization for graphical models with unequal shrinkage
Mathematics Subject Classification:
- Asymptotic properties of parametric estimators (62F12)
- Linear regression; mixed models (62J05)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- SparseNet: coordinate descent with nonconvex penalties
- Pathwise coordinate optimization
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- Lasso-type recovery of sparse representations for high-dimensional data
- Simultaneous analysis of Lasso and Dantzig selector
- Title not available
- Sparsity and Smoothness Via the Fused Lasso
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- Title not available
- Variable selection and regression analysis for graph-structured covariates with an application to genomics
- The sparse Laplacian shrinkage estimator for high-dimensional regression
- Sparse inverse covariance estimation with the graphical lasso
- Model selection and estimation in the Gaussian graphical model
- Nonconcave penalized likelihood with a diverging number of parameters.
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Sparse regression with exact clustering
- Nonnegative-Lasso and application in index tracking
- A General Framework for Weighted Gene Co-Expression Network Analysis
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- The Mnet method for variable selection
- Shrinkage and model selection with correlated variables via weighted fusion
- Adaptive and reversed penalty for analysis of high-dimensional correlated data
Cited In (4)
Uses Software