Learning loopy graphical models with latent variables: efficient methods and guarantees
From MaRDI portal
Publication: Q355078
DOI: 10.1214/12-AOS1070
zbMATH Open: 1267.62070
arXiv: 1203.3887
OpenAlex: W1994400630
MaRDI QID: Q355078
Authors: Animashree Anandkumar, Ragupathyraj Valluvan
Publication date: 24 July 2013
Published in: The Annals of Statistics
Abstract: The problem of structure estimation in graphical models with latent variables is considered. We characterize conditions for tractable graph estimation and develop efficient methods with provable guarantees. We consider models where the underlying Markov graph is locally tree-like, and the model is in the regime of correlation decay. For the special case of the Ising model, the number of samples \(n\) required for structural consistency of our method scales as \(n = \Omega(\theta_{\min}^{-\delta\eta(\eta+1)-2}\log p)\), where \(p\) is the number of variables, \(\theta_{\min}\) is the minimum edge potential, \(\delta\) is the depth (i.e., the distance from a hidden node to its nearest observed nodes), and \(\eta\) is a parameter which depends on the bounds on node and edge potentials in the Ising model. Necessary conditions for structural consistency under any algorithm are derived, and our method nearly matches the lower bound on sample requirements. Further, the proposed method is practical to implement and provides flexibility to control the number of latent variables and the cycle lengths in the output graph.
Full work available at URL: https://arxiv.org/abs/1203.3887
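The abstract's sample-complexity claim can be illustrated numerically. The sketch below is a hypothetical helper (not from the paper) that evaluates the scaling \(n \propto \theta_{\min}^{-\delta\eta(\eta+1)-2}\log p\), assuming this reading of the Ising-model bound; the constants hidden by the \(\Omega(\cdot)\) notation are suppressed, so the values are only proportional trends.

```python
import math

def sample_complexity_scaling(theta_min: float, delta: float, eta: float, p: int) -> float:
    """Illustrative (constant-free) evaluation of the abstract's scaling
    n = Omega(theta_min^(-delta*eta*(eta+1) - 2) * log p).

    theta_min : minimum edge potential (0 < theta_min < 1 in this sketch)
    delta     : depth, i.e. distance from a hidden node to observed nodes
    eta       : parameter depending on bounds on node/edge potentials
    p         : number of variables
    """
    exponent = -delta * eta * (eta + 1) - 2
    return theta_min ** exponent * math.log(p)

# Weaker minimum edge potentials and larger graphs both drive up
# the (proportional) sample requirement:
n_weak = sample_complexity_scaling(theta_min=0.1, delta=2, eta=1, p=100)
n_strong = sample_complexity_scaling(theta_min=0.2, delta=2, eta=1, p=100)
```

As expected from the formula, halving \(\theta_{\min}\) inflates the requirement polynomially (through the negative exponent), while growing \(p\) only adds a logarithmic factor.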
Recommendations
- Graphical model selection with latent variables
- Graphical model selection for Gaussian conditional random fields in the presence of latent variables
- Latent variable graphical model selection via convex optimization
- Learning loosely connected Markov random fields
- Learning latent tree graphical models
Mathematics Subject Classification:
- Monte Carlo methods (65C05)
- Applications of graph theory (05C90)
- Estimation in multivariate analysis (62H12)
- Distance in graphs (05C12)
Cites Work
- Estimating the dimension of a model
- High-dimensional graphs and variable selection with the Lasso
- Biological Sequence Analysis
- Title not available
- Title not available
- Title not available
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- Latent variable graphical model selection via convex optimization
- High-dimensional structure estimation in Ising models: local separation criterion
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- 10.1162/jmlr.2003.3.4-5.993
- Rejoinder: Latent variable graphical model selection via convex optimization
- Title not available
- The Complexity of Distinguishing Markov Random Fields
- Ising models on locally tree-like graphs
- Gibbs measures and phase transitions
- Learning Markov networks: Maximum bounded tree-width graphs
- Information, Physics, and Computation
- Recovering a tree from the leaf colourations it generates under a Markov model
- Optimal phylogenetic reconstruction
- On the girth of random Cayley graphs
- Hierarchical latent class models for cluster analysis
- Title not available
- Learning loopy graphical models with latent variables: efficient methods and guarantees
- A few logs suffice to build (almost) all trees (I)
- Title not available
- Inexact graph matching for structural pattern recognition
- Learning latent tree graphical models
- High-dimensional Gaussian graphical model selection: walk summability and local separation criterion
- Combinatorial criteria for uniqueness of Gibbs measures
- Reconstruction of Markov Random Fields from Samples: Some Observations and Algorithms
- Girth and treewidth
Cited In (8)
- Learning latent tree graphical models
- Discovering the topology of complex networks via adaptive estimators
- Learning factor graphs in polynomial time and sample complexity
- Title not available
- Learning loopy graphical models with latent variables: efficient methods and guarantees
- Learning loosely connected Markov random fields
- Latent variable graphical model selection via convex optimization
- Learning a tree-structured Ising model in order to make predictions