Alternating Direction Methods for Latent Variable Gaussian Graphical Model Selection

From MaRDI portal
Publication:5378251

DOI: 10.1162/NECO_A_00379 · zbMATH Open: 1418.62234 · arXiv: 1206.1275 · OpenAlex: W2098589050 · Wikidata: Q44914079 · Scholia: Q44914079 · MaRDI QID: Q5378251 · FDO: Q5378251


Authors: Lingzhou Xue, Hui Zou, Shiqian Ma


Publication date: 12 June 2019

Published in: Neural Computation

Abstract: Chandrasekaran, Parrilo, and Willsky (2010) proposed a convex optimization problem to characterize graphical model selection in the presence of unobserved variables. This convex optimization problem aims to estimate, from sample data, an inverse covariance matrix that can be decomposed into a sparse matrix minus a low-rank matrix. Solving this convex optimization problem is very challenging, especially for large problems. In this paper, we propose two alternating direction methods for solving this problem. The first method applies the classical alternating direction method of multipliers to solve the problem as a consensus problem. The second method is a proximal-gradient-based alternating direction method of multipliers. Our methods exploit the special structure of the problem and can thus solve large problems very efficiently. A global convergence result is established for the proposed methods. Numerical results on both synthetic data and gene expression data show that our methods usually solve problems with one million variables in one to two minutes, and are usually five to thirty-five times faster than a state-of-the-art Newton-CG proximal point algorithm.
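The sparse-minus-low-rank estimator described in the abstract minimizes -log det(S - L) + tr(Σ(S - L)) + α‖S‖₁ + β tr(L) subject to S - L ≻ 0 and L ⪰ 0. The sketch below shows one plausible ADMM splitting for this program, introducing a consensus variable R = S - L; it is an illustrative reconstruction under stated assumptions, not the authors' exact algorithm, and the function name, penalty parameters, and stopping rule are all hypothetical.

```python
import numpy as np

def latent_ggm_admm(Sigma, alpha, beta, rho=1.0, n_iter=200):
    """Illustrative ADMM sketch (not the paper's exact method) for
    min -logdet(R) + tr(Sigma R) + alpha*||S||_1 + beta*tr(L)
    s.t. R = S - L, R > 0, L >= 0 (in the PSD sense)."""
    p = Sigma.shape[0]
    S = np.eye(p)              # sparse part estimate
    L = np.zeros((p, p))       # low-rank part estimate
    U = np.zeros((p, p))       # scaled dual variable for R = S - L
    for _ in range(n_iter):
        # R-step: prox of -logdet + trace term, closed form via eigendecomposition.
        # Stationarity gives rho*R - R^{-1} = rho*A with A = S - L - U - Sigma/rho,
        # so each eigenvalue solves rho*d^2 - rho*lam*d - 1 = 0.
        A = S - L - U - Sigma / rho
        lam, Q = np.linalg.eigh((A + A.T) / 2)
        d = (lam + np.sqrt(lam**2 + 4.0 / rho)) / 2.0
        R = (Q * d) @ Q.T
        # S-step: elementwise soft-thresholding (prox of the l1 penalty).
        B = R + L + U
        S = np.sign(B) * np.maximum(np.abs(B) - alpha / rho, 0.0)
        # L-step: shift by (beta/rho)*I for the trace penalty,
        # then project onto the PSD cone by clipping negative eigenvalues.
        C = S - R - U - (beta / rho) * np.eye(p)
        mu, P = np.linalg.eigh((C + C.T) / 2)
        L = (P * np.maximum(mu, 0.0)) @ P.T
        # Dual ascent on the consensus residual R - (S - L).
        U = U + R - S + L
    return R, S, L
```

As a usage sketch, one would pass a sample covariance matrix (e.g. `np.cov(X, rowvar=False)`) as `Sigma`; the R-step stays positive definite by construction since each thresholded eigenvalue `d` is strictly positive, which is why the log-determinant term is handled there rather than in the S- or L-step.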


Full work available at URL: https://arxiv.org/abs/1206.1275






Cited In (30)


This page was built for publication: Alternating Direction Methods for Latent Variable Gaussian Graphical Model Selection
