Learning loopy graphical models with latent variables: efficient methods and guarantees
From MaRDI portal
Abstract: The problem of structure estimation in graphical models with latent variables is considered. We characterize conditions for tractable graph estimation and develop efficient methods with provable guarantees. We consider models where the underlying Markov graph is locally tree-like and the model is in the regime of correlation decay. For the special case of the Ising model, the number of samples required for structural consistency of our method scales as \(n = \Omega(\theta_{\min}^{-\delta\eta(\eta+1)-2}\log p)\), where \(p\) is the number of variables, \(\theta_{\min}\) is the minimum edge potential, \(\delta\) is the depth (i.e., the distance from a hidden node to the nearest observed nodes), and \(\eta\) is a parameter which depends on the bounds on node and edge potentials in the Ising model. Necessary conditions for structural consistency under any algorithm are derived, and our method nearly matches the lower bound on sample requirements. Further, the proposed method is practical to implement and provides flexibility to control the number of latent variables and the cycle lengths in the output graph.
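To make the scaling behavior concrete, the sketch below evaluates the sample-complexity bound \(n = \Omega(\theta_{\min}^{-\delta\eta(\eta+1)-2}\log p)\) for hypothetical parameter values. The constants hidden inside the \(\Omega(\cdot)\) are dropped, and all numeric values are illustrative, not taken from the paper:

```python
import math

def sample_bound(theta_min, delta, eta, p):
    """Illustrative scaling of the sample requirement
    n ~ theta_min^{-(delta*eta*(eta+1) + 2)} * log(p).
    Constants inside the Omega(.) are omitted."""
    exponent = delta * eta * (eta + 1) + 2
    return theta_min ** (-exponent) * math.log(p)

# Deeper hidden nodes (larger depth delta) sharply increase
# the number of samples needed; weaker minimum edge potentials
# (smaller theta_min) do the same.
shallow = sample_bound(theta_min=0.5, delta=1, eta=1, p=1000)
deep = sample_bound(theta_min=0.5, delta=3, eta=1, p=1000)
```

With these illustrative values, `deep` exceeds `shallow` by a factor of \(\theta_{\min}^{-\delta\eta(\eta+1)}\) growth in the exponent, showing why the depth of the hidden nodes dominates the sample requirement.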
Recommendations
- Graphical model selection with latent variables
- Graphical model selection for Gaussian conditional random fields in the presence of latent variables
- Latent variable graphical model selection via convex optimization
- Learning loosely connected Markov random fields
- Learning latent tree graphical models
Cites work
- scientific article; zbMATH DE number 5957255
- scientific article; zbMATH DE number 1134987
- scientific article; zbMATH DE number 2038845
- scientific article; zbMATH DE number 1865935
- scientific article; zbMATH DE number 3290846
- scientific article; zbMATH DE number 964896
- 10.1162/jmlr.2003.3.4-5.993
- A few logs suffice to build (almost) all trees (I)
- Biological Sequence Analysis
- Combinatorial criteria for uniqueness of Gibbs measures
- Estimating the dimension of a model
- Gibbs measures and phase transitions
- Girth and treewidth
- Hierarchical latent class models for cluster analysis
- High-dimensional Gaussian graphical model selection: walk summability and local separation criterion
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- High-dimensional graphs and variable selection with the Lasso
- High-dimensional structure estimation in Ising models: local separation criterion
- Inexact graph matching for structural pattern recognition
- Information, Physics, and Computation
- Ising models on locally tree-like graphs
- Latent variable graphical model selection via convex optimization
- Learning Markov networks: Maximum bounded tree-width graphs
- Learning latent tree graphical models
- Learning loopy graphical models with latent variables: efficient methods and guarantees
- On the girth of random Cayley graphs
- Optimal phylogenetic reconstruction
- Reconstruction of Markov Random Fields from Samples: Some Observations and Algorithms
- Recovering a tree from the leaf colourations it generates under a Markov model
- Rejoinder: Latent variable graphical model selection via convex optimization
- The Complexity of Distinguishing Markov Random Fields
Cited in (9)
- Learning latent tree graphical models
- Discovering the topology of complex networks via adaptive estimators
- Learning factor graphs in polynomial time and sample complexity
- scientific article; zbMATH DE number 7370576
- Learning loopy graphical models with latent variables: efficient methods and guarantees
- Efficient learning of discrete graphical models*
- Latent variable graphical model selection via convex optimization
- Learning loosely connected Markov random fields
- Learning a tree-structured Ising model in order to make predictions