Shrinkage tuning parameter selection in precision matrices estimation
From MaRDI portal
Publication: 538141
DOI: 10.1016/J.JSPI.2011.03.008
zbMATH Open: 1213.62099
arXiv: 0909.1123
OpenAlex: W1556261580
MaRDI QID: Q538141
Authors: Heng Lian
Publication date: 23 May 2011
Published in: Journal of Statistical Planning and Inference
Abstract: Recent literature provides many computational and modeling approaches for covariance matrix estimation in penalized Gaussian graphical models, but relatively little study has been carried out on the choice of the tuning parameter. This paper aims to fill this gap by focusing on the problem of shrinkage parameter selection when estimating sparse precision matrices using the penalized likelihood approach. Previous approaches typically used K-fold cross-validation in this regard. In this paper, we first derive generalized approximate cross-validation for tuning parameter selection, which is not only a more computationally efficient alternative but also achieves a smaller error rate for model fitting than leave-one-out cross-validation. For consistency in the selection of nonzero entries in the precision matrix, we employ a Bayesian information criterion which provably identifies the nonzero conditional correlations in the Gaussian model. Our simulations demonstrate the general superiority of the two proposed selectors over leave-one-out cross-validation, ten-fold cross-validation and the Akaike information criterion.
Full work available at URL: https://arxiv.org/abs/0909.1123
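The abstract's BIC-based selector can be illustrated with a minimal sketch. This is not the paper's exact implementation: it assumes scikit-learn's graphical lasso as the penalized-likelihood solver, a small hand-picked grid of shrinkage values, and the common BIC-type criterion that counts nonzero off-diagonal entries of the precision estimate as degrees of freedom.

```python
# Sketch: BIC-based shrinkage parameter selection for sparse precision
# matrix estimation (assumptions: scikit-learn's GraphicalLasso as the
# penalized-likelihood estimator; grid values chosen for illustration).
import numpy as np
from sklearn.covariance import GraphicalLasso

def bic_score(S, Omega, n):
    """BIC-type criterion: -2 * log-likelihood + log(n) * df, where df
    counts the nonzero off-diagonal entries (upper triangle) of Omega."""
    _, logdet = np.linalg.slogdet(Omega)
    loglik = n / 2.0 * (logdet - np.trace(S @ Omega))
    df = np.count_nonzero(np.triu(Omega, k=1))
    return -2.0 * loglik + np.log(n) * df

def select_lambda(X, grid):
    """Fit the graphical lasso over a grid of shrinkage values and
    return the (lambda, precision estimate) pair minimizing the BIC."""
    n = X.shape[0]
    S = np.cov(X, rowvar=False)
    return min(
        ((lam, GraphicalLasso(alpha=lam).fit(X).precision_) for lam in grid),
        key=lambda pair: bic_score(S, pair[1], n),
    )

rng = np.random.default_rng(0)
# Synthetic data from a tridiagonal (sparse) true precision matrix.
p = 5
Omega_true = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(Omega_true), size=200)
lam, Omega_hat = select_lambda(X, grid=[0.01, 0.05, 0.1, 0.2, 0.4])
```

The grid search costs one graphical-lasso fit per candidate value, which is the setting where the paper's generalized approximate cross-validation offers its computational advantage over leave-one-out refitting.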
Recommendations
- Model selection and estimation in the Gaussian graphical model
- Selecting the tuning parameter in penalized Gaussian graphical models
- Covariance matrix selection and estimation via penalised normal likelihood
- Tuning-parameter selection in regularized estimations of large covariance matrices
- A computationally fast alternative to cross-validation in penalized Gaussian graphical models
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Asymptotic Statistics
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation
- High-dimensional graphs and variable selection with the Lasso
- Title not available
- Pattern recognition and machine learning.
- Covariance regularization by thresholding
- Sparsistency and rates of convergence in large covariance matrix estimation
- Sparse inverse covariance estimation with the graphical lasso
- Network exploration via the adaptive LASSO and SCAD penalties
- Shrinkage tuning parameter selection with a diverging number of parameters
- Gradient directed regularization for sparse Gaussian concentration graphs, with applications to inference of genetic networks
- Model selection and estimation in the Gaussian graphical model
- Can the strengths of AIC and BIC be shared? A conflict between model identification and regression estimation
- Unified LASSO Estimation by Least Squares Approximation
- Title not available
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Nonconcave penalized likelihood with a diverging number of parameters.
- Title not available
Cited In (15)
- Ridge estimation of inverse covariance matrices from high-dimensional data
- Estimation of a sparse and spiked covariance matrix
- Selecting the tuning parameter in penalized Gaussian graphical models
- Title not available
- High-dimensional missing data imputation via undirected graphical model
- A computationally fast alternative to cross-validation in penalized Gaussian graphical models
- Model selection and estimation in the Gaussian graphical model
- Selecting a shrinkage parameter in structural equation modeling with a near singular covariance matrix by the GIC minimization method
- Partial correlation graphical LASSO
- MARS as an alternative approach of Gaussian graphical model for biochemical networks
- Shrinking characteristics of precision matrix estimators
- Group-wise shrinkage estimation in penalized model-based clustering
- Shrinkage estimation of large dimensional precision matrix using random matrix theory
- The spectral condition number plot for regularization parameter evaluation
- Tuning-parameter selection in regularized estimations of large covariance matrices