A convex pseudolikelihood framework for high dimensional partial correlation estimation with convergence guarantees
From MaRDI portal
Publication:5378137
Abstract: Sparse high-dimensional graphical model selection is a topic of much interest in modern-day statistics. A popular approach is to apply \(\ell_1\)-penalties to either (1) parametric likelihoods or (2) regularized regression/pseudo-likelihoods, the latter having the distinct advantage of not explicitly assuming Gaussianity. However, none of the popular methods proposed for solving pseudo-likelihood-based objective functions have provable convergence guarantees, so it is not clear whether the corresponding estimators exist, are computable, or actually yield correct partial correlation graphs. This paper proposes a new pseudo-likelihood-based graphical model selection method that aims to overcome the shortcomings of current methods while retaining their respective strengths. In particular, we introduce a novel framework that leads to a convex formulation of the partial covariance regression graph problem, resulting in an objective function comprising quadratic forms. The objective is then optimized via a coordinate-wise approach. The specific functional form of the objective facilitates rigorous convergence analysis leading to convergence guarantees; an important property that cannot be established using standard results when the dimension is larger than the sample size, as is often the case in high-dimensional applications. These convergence guarantees ensure that the estimators are well defined under very general conditions and are always computable. In addition, the estimators have good large-sample properties and respect symmetry. Applications to simulated and real data, timing comparisons, and numerical convergence are also demonstrated. Finally, we present a novel unifying framework that casts all graphical pseudo-likelihood methods as special cases of a more general formulation, leading to important insights.
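The coordinate-wise optimization of a convex pseudo-likelihood objective described in the abstract can be sketched as follows. This is an illustrative implementation under our own notation, not the authors' code: the function names, the arrangement of the closed-form updates, and the default parameters are assumptions. The objective minimized is a CONCORD-style convex pseudo-likelihood \(f(\Omega) = -\sum_i \log \omega_{ii} + \tfrac{1}{2}\sum_i \omega_i' S \omega_i + \lambda \sum_{i<j} |\omega_{ij}|\), where \(S\) is the sample covariance; each off-diagonal coordinate has a soft-thresholding update and each diagonal coordinate is the positive root of a scalar quadratic, so every update is well defined regardless of whether the dimension exceeds the sample size.

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft-thresholding operator: the proximal map of lam * |x|."""
    return np.sign(x) * max(abs(x) - lam, 0.0)

def concord_cd(X, lam, n_iter=100, tol=1e-6):
    """Coordinate-wise minimization of a CONCORD-style convex
    pseudo-likelihood objective (an illustrative sketch):

        f(Omega) = -sum_i log(omega_ii)
                   + 0.5 * sum_i omega_i' S omega_i
                   + lam * sum_{i<j} |omega_ij|

    Symmetry of Omega is enforced by updating omega_ij and omega_ji
    jointly, and the diagonal update always yields a positive root,
    so the iterates stay in the feasible set.
    """
    n, p = X.shape
    S = X.T @ X / n                 # sample covariance (uncentred, for brevity)
    Omega = np.eye(p)               # feasible start: positive diagonal
    for _ in range(n_iter):
        Omega_old = Omega.copy()
        # Off-diagonal coordinates: closed-form soft-thresholding update.
        for i in range(p):
            for j in range(i + 1, p):
                # r = sum_{k != j} s_jk w_ik + sum_{k != i} s_ik w_jk
                r = (S[j] @ Omega[i] - S[j, j] * Omega[i, j]
                     + S[i] @ Omega[j] - S[i, i] * Omega[j, i])
                w = soft_threshold(-r, lam) / (S[i, i] + S[j, j])
                Omega[i, j] = Omega[j, i] = w
        # Diagonal coordinates: positive root of a*w^2 + b*w - 1 = 0.
        for i in range(p):
            b = S[i] @ Omega[i] - S[i, i] * Omega[i, i]
            Omega[i, i] = (-b + np.sqrt(b * b + 4.0 * S[i, i])) / (2.0 * S[i, i])
        if np.max(np.abs(Omega - Omega_old)) < tol:
            break
    return Omega
```

For a large penalty the off-diagonal entries are thresholded to zero and each diagonal entry converges to \(1/\sqrt{s_{ii}}\); smaller penalties recover a sparse symmetric partial correlation graph estimate.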
Recommendations
- High-dimensional covariance estimation based on Gaussian graphical models
- Model selection and estimation in the Gaussian graphical model
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Sparsistency and rates of convergence in large covariance matrix estimation
- Concave penalized estimation of sparse Gaussian Bayesian networks
Cited in (37)
- High-dimensional correlation matrix estimation for general continuous data with Bagging technique
- Bayesian regularization of Gaussian graphical models with measurement error
- Edge selection for undirected graphs
- A scalable sparse Cholesky based approach for learning high-dimensional covariance matrices in ordered data
- Modeling correlated marker effects in genome-wide prediction via Gaussian concentration graph models
- Generalized score matching for non-negative data
- Bayesian regularization for graphical models with unequal shrinkage
- High-dimensional Markowitz portfolio optimization problem: empirical comparison of covariance matrix estimators
- An efficient parallel block coordinate descent algorithm for large-scale precision matrix estimation using graphics processing units
- Response variable selection in multivariate linear regression
- Kronecker-structured covariance models for multiway data
- Analysis of air quality time series of Hong Kong with graphical modeling
- Estimation of graphical models: an overview of selected topics
- Envelope-based partial partial least squares with application to cytokine-based biomarker analysis for COVID-19
- A generalized likelihood-based Bayesian approach for scalable joint regression and covariance selection in high dimensions
- Contraction of a quasi-Bayesian model with shrinkage priors in precision matrix estimation
- Bayesian discriminant analysis using a high dimensional predictor
- A Bayesian Subset Specific Approach to Joint Selection of Multiple Graphical Models
- Covariance structure estimation with Laplace approximation
- Consistent skinny Gibbs in probit regression
- Optimal estimation of a large-dimensional covariance matrix under Stein's loss
- Some aspects of response variable selection and estimation in multivariate linear regression
- Envelope-based sparse partial least squares
- Positive-definite modification of a covariance matrix by minimizing the matrix \(\ell_{\infty}\) norm with applications to portfolio optimization
- Fixed support positive-definite modification of covariance matrix estimators via linear shrinkage
- Multivariate Gaussian network structure learning
- Statistical inference via conditional Bayesian posteriors in high-dimensional linear regression
- Estimation of Gaussian directed acyclic graphs using partial ordering information with applications to DREAM3 networks and dairy cattle data
- An efficient GPU-parallel coordinate descent algorithm for sparse precision matrix estimation via scaled Lasso
- Inferring large graphs using \(\ell_1\)-penalized likelihood
- Robust and sparse Gaussian graphical modelling under cell-wise contamination
- Assisted graphical model for gene expression data analysis
- Learning Gaussian graphical models with latent confounders
- Loss function, unbiasedness, and optimality of Gaussian graphical model selection
- Development of network-guided transcriptomic risk score for disease prediction
- Conditional score matching for high-dimensional partial graphical models
- Graph-guided banding of the covariance matrix