Shrinkage tuning parameter selection in precision matrices estimation
From MaRDI portal
Publication:538141
Abstract: Recent literature provides many computational and modeling approaches for covariance matrix estimation in penalized Gaussian graphical models, but relatively little study has been devoted to the choice of the tuning parameter. This paper aims to fill this gap by focusing on the problem of shrinkage parameter selection when estimating sparse precision matrices using the penalized likelihood approach. Previous approaches typically used K-fold cross-validation for this purpose. In this paper, we first derive the generalized approximate cross-validation for tuning parameter selection, which is not only a more computationally efficient alternative but also achieves a smaller error rate for model fitting than leave-one-out cross-validation. For consistency in selecting the nonzero entries of the precision matrix, we employ a Bayesian information criterion which provably identifies the nonzero conditional correlations in the Gaussian model. Our simulations demonstrate the general superiority of the two proposed selectors over leave-one-out cross-validation, ten-fold cross-validation, and the Akaike information criterion.
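To illustrate the BIC-type selection strategy the abstract describes, the sketch below picks the shrinkage parameter for a graphical-lasso precision estimate by minimizing a BIC score whose degrees of freedom count the nonzero off-diagonal entries. This is a minimal illustration using scikit-learn on toy data, not the paper's own implementation; the candidate grid and the exact BIC formula are assumptions for demonstration.

```python
# Minimal sketch (assumed setup, not the paper's code): BIC-based shrinkage
# parameter selection for the graphical lasso via scikit-learn.
import numpy as np
from sklearn.covariance import empirical_covariance, graphical_lasso

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))  # toy data; substitute real observations

S = empirical_covariance(X)  # sample covariance matrix

def bic_score(S, precision, n):
    """BIC for a Gaussian graphical model: -2*loglik + log(n)*df, where
    df counts the nonzero upper-triangular (off-diagonal) entries of the
    estimated precision matrix (one assumed convention for the penalty)."""
    sign, logdet = np.linalg.slogdet(precision)
    loglik = (n / 2.0) * (logdet - np.trace(S @ precision))
    df = np.count_nonzero(np.triu(precision, k=1))
    return -2.0 * loglik + np.log(n) * df

# Grid of candidate shrinkage parameters (illustrative values).
candidates = [0.01, 0.05, 0.1, 0.2, 0.4]
best_alpha, best_bic = None, np.inf
for alpha in candidates:
    _, precision = graphical_lasso(S, alpha=alpha)
    bic = bic_score(S, precision, n)
    if bic < best_bic:
        best_alpha, best_bic = alpha, bic

print(best_alpha)  # selected shrinkage parameter
```

Unlike K-fold cross-validation, this criterion requires only one model fit per candidate value, which is the computational advantage the paper emphasizes for its selectors.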
Recommendations
- Model selection and estimation in the Gaussian graphical model
- Selecting the tuning parameter in penalized Gaussian graphical models
- Covariance matrix selection and estimation via penalised normal likelihood
- Tuning-parameter selection in regularized estimations of large covariance matrices
- A computationally fast alternative to cross-validation in penalized Gaussian graphical models
Cites work
- scientific article; zbMATH DE number 5957408
- scientific article; zbMATH DE number 1034037
- scientific article; zbMATH DE number 932629
- Asymptotic Statistics
- Can the strengths of AIC and BIC be shared? A conflict between model identification and regression estimation
- Covariance regularization by thresholding
- Gradient directed regularization for sparse Gaussian concentration graphs, with applications to inference of genetic networks
- High-dimensional graphs and variable selection with the Lasso
- Model selection and estimation in the Gaussian graphical model
- Network exploration via the adaptive LASSO and SCAD penalties
- Nonconcave penalized likelihood with a diverging number of parameters.
- Pattern recognition and machine learning.
- Shrinkage tuning parameter selection with a diverging number of parameters
- Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation
- Sparse inverse covariance estimation with the graphical lasso
- Sparsistency and rates of convergence in large covariance matrix estimation
- The Adaptive Lasso and Its Oracle Properties
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Unified LASSO Estimation by Least Squares Approximation
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in (15)
- Ridge estimation of inverse covariance matrices from high-dimensional data
- Estimation of a sparse and spiked covariance matrix
- Selecting the tuning parameter in penalized Gaussian graphical models
- scientific article; zbMATH DE number 7168259
- A computationally fast alternative to cross-validation in penalized Gaussian graphical models
- High-dimensional missing data imputation via undirected graphical model
- Model selection and estimation in the Gaussian graphical model
- Selecting a shrinkage parameter in structural equation modeling with a near singular covariance matrix by the GIC minimization method
- Partial correlation graphical LASSO
- MARS as an alternative approach of Gaussian graphical model for biochemical networks
- Shrinking characteristics of precision matrix estimators
- Group-wise shrinkage estimation in penalized model-based clustering
- The spectral condition number plot for regularization parameter evaluation
- Shrinkage estimation of large dimensional precision matrix using random matrix theory
- Tuning-parameter selection in regularized estimations of large covariance matrices