High-dimensional structure estimation in Ising models: local separation criterion
From MaRDI portal
DOI: 10.1214/12-AOS1009 · zbMATH Open: 1297.62124 · arXiv: 1107.1736 · MaRDI QID: Q693728
Authors: Animashree Anandkumar, Vincent Y. F. Tan, Furong Huang, Alan S. Willsky
Publication date: 10 December 2012
Published in: The Annals of Statistics
Abstract: We consider the problem of high-dimensional Ising (graphical) model selection. We propose a simple algorithm for structure estimation based on thresholding of the empirical conditional variation distances. We introduce a novel criterion for tractable graph families, where this method is efficient, based on the presence of sparse local separators between node pairs in the underlying graph. For such graphs, the proposed algorithm has a sample complexity of \(n=\Omega(J_{\min }^{-2}\log p)\), where \(p\) is the number of variables and \(J_{\min }\) is the minimum (absolute) edge potential in the model. We also establish nonasymptotic necessary and sufficient conditions for structure estimation.
Full work available at URL: https://arxiv.org/abs/1107.1736
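As a rough illustration of the thresholding idea described in the abstract, the sketch below estimates edges by minimizing an empirical conditional variation distance over small conditioning sets and keeping a pair whenever that minimum exceeds a threshold. This is a minimal sketch, not the authors' reference implementation: the parameter names `eta` (bound on local-separator size) and `xi` (threshold), and the toy Gibbs sampler, are illustrative assumptions.

```python
# Minimal sketch (assumed interface, not the paper's code) of structure
# estimation by thresholding empirical conditional variation distances.
from itertools import combinations
import numpy as np


def empirical_cond_variation(X, i, j, S):
    """Empirical variation distance between P(X_i | X_j=+1, X_S=x_S) and
    P(X_i | X_j=-1, X_S=x_S), maximized over observed configurations x_S.
    X is an (n, p) array of +/-1 spins; S is a list of conditioning nodes."""
    n, _ = X.shape
    best = 0.0
    configs = {tuple(row) for row in X[:, S]} if S else {()}
    for x_S in configs:
        mask = np.ones(n, dtype=bool)
        for idx, val in zip(S, x_S):
            mask &= X[:, idx] == val
        plus = mask & (X[:, j] == 1)
        minus = mask & (X[:, j] == -1)
        if plus.sum() == 0 or minus.sum() == 0:
            continue  # configuration not observed under both signs of X_j
        p_plus = (X[plus, i] == 1).mean()
        p_minus = (X[minus, i] == 1).mean()
        best = max(best, abs(p_plus - p_minus))  # TV distance for a binary variable
    return best


def threshold_structure_estimate(X, eta=2, xi=0.1):
    """Keep edge (i, j) iff the conditional variation distance stays above xi
    for every conditioning set of size at most eta, i.e. no small set of
    other nodes empirically 'separates' i from j."""
    _, p = X.shape
    edges = set()
    for i, j in combinations(range(p), 2):
        others = [k for k in range(p) if k not in (i, j)]
        min_dist = min(
            empirical_cond_variation(X, i, j, list(S))
            for r in range(eta + 1)
            for S in combinations(others, r)
        )
        if min_dist > xi:
            edges.add((i, j))
    return edges


if __name__ == "__main__":
    # Toy data from a 3-node chain 0 - 1 - 2, crudely Gibbs-sampled; this only
    # exercises the routine and is not a statement about recovery guarantees.
    rng = np.random.default_rng(0)
    n = 5000
    X = np.empty((n, 3), dtype=int)
    x = np.array([1, 1, 1])
    for t in range(n):
        for s in range(3):
            field = 0.8 * (x[s - 1] if s > 0 else 0) + 0.8 * (x[s + 1] if s < 2 else 0)
            prob = 1.0 / (1.0 + np.exp(-2.0 * field))
            x[s] = 1 if rng.random() < prob else -1
        X[t] = x
    print(threshold_structure_estimate(X, eta=1, xi=0.15))
```

With the chain above, conditioning on the middle node should drive the (0, 2) distance toward zero, so only the two true edges survive the threshold; the exponential enumeration over conditioning sets is what the paper's local-separation criterion keeps small.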
Recommendations
- High-dimensional Gaussian graphical model selection: walk summability and local separation criterion
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- High-dimensional Ising model selection with Bayesian information criteria
- Sparse estimation in Ising model via penalized Monte Carlo methods
- Efficiently learning Ising models on arbitrary graphs (extended abstract)
Cites Work
- Statistical mechanics of complex networks
- Markov chains and mixing times. With a chapter on ``Coupling from the past'' by James G. Propp and David B. Wilson.
- Graphical models, exponential families, and variational inference
- Elements of Information Theory
- High-dimensional graphs and variable selection with the Lasso
- Biological Sequence Analysis
- Estimating high-dimensional directed acyclic graphs with the PC-algorithm
- Approximating discrete probability distributions with dependence trees
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- High-dimensional structure estimation in Ising models: local separation criterion
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Rejoinder: Latent variable graphical model selection via convex optimization
- Collective dynamics of `small-world' networks
- Forest density estimation
- Learning Bayesian networks from data: An information-theory based approach
- The Complexity of Distinguishing Markov Random Fields
- Complex graphs and networks
- Markov Chains
- Random graph models of social networks
- Complex social networks.
- Learning Markov networks: Maximum bounded tree-width graphs
- Short cycles in random regular graphs
- Mengerian theorems for paths of bounded length
- Diameter and treewidth in minor-closed graph families
- Ising models on power-law random graphs
- On the girth of random Cayley graphs
- Learning latent tree graphical models
- Reconstruction of Markov Random Fields from Samples: Some Observations and Algorithms
- A Large-Deviation Analysis of the Maximum-Likelihood Learning of Markov Tree Structures
- Learning factor graphs in polynomial time and sample complexity
- Learning Gaussian Tree Models: Analysis of Error Exponents and Extremal Structures
Cited In (27)
- Efficiently learning Ising models on arbitrary graphs (extended abstract)
- High-dimensional Gaussian graphical model selection: walk summability and local separation criterion
- A global approach for learning sparse Ising models
- AMP chain graphs: minimal separators and structure learning algorithms
- Causal Structural Learning via Local Graphs
- Bayesian model selection for high-dimensional Ising models, with applications to educational data
- Empirical comparison study of approximate methods for structure selection in binary graphical models
- Tensor recovery in high-dimensional Ising models
- Structure estimation for discrete graphical models: generalized covariance matrices and their inverses
- High-dimensional Ising model selection with Bayesian information criteria
- Sparse estimation in Ising model via penalized Monte Carlo methods
- Universality of the mean-field for the Potts model
- Computational implications of reducing data to sufficient statistics
- Estimating heterogeneous graphical models for discrete data with an application to roll call voting
- Sparse model selection in the highly under-sampled regime
- Inequalities on partial correlations in Gaussian graphical models containing star shapes
- Learning loopy graphical models with latent variables: efficient methods and guarantees
- Updating of the Gaussian graphical model through targeted penalized estimation
- An expectation maximization algorithm for high-dimensional model selection for the Ising model with misclassified states*
- Learning a tree-structured Ising model in order to make predictions
- Joint estimation of parameters in Ising model
- High-dimensional structure estimation in Ising models: local separation criterion
- Nonconcave penalized composite conditional likelihood estimation of sparse Ising models
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- Property testing in high-dimensional Ising models
- A decomposition-based algorithm for learning the structure of multivariate regression chain graphs
- Graphical Models and Message-Passing Algorithms: Some Introductory Lectures