High-dimensional structure estimation in Ising models: local separation criterion
From MaRDI portal
Abstract: We consider the problem of high-dimensional Ising (graphical) model selection. We propose a simple algorithm for structure estimation based on thresholding of the empirical conditional variation distances. We introduce a novel criterion for tractable graph families on which this method is efficient, based on the presence of sparse local separators between node pairs in the underlying graph. For such graphs, the proposed algorithm has a sample complexity of \(n = \Omega(J_{\min}^{-2}\log p)\), where \(p\) is the number of variables and \(J_{\min}\) is the minimum (absolute) edge potential in the model. We also establish nonasymptotic necessary and sufficient conditions for structure estimation.
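The thresholding procedure described in the abstract can be illustrated with a minimal sketch: for each node pair, search over small candidate separator sets and declare an edge only if every such conditioning set leaves a large empirical conditional variation distance. The function names, the choice of taking the maximum over conditioning assignments, and the parameters `eta` (the local-separator size bound) and `threshold` are illustrative assumptions, not the paper's exact estimator or threshold rule.

```python
from collections import defaultdict
from itertools import combinations

def empirical_cond_tv(samples, i, j, S):
    """Empirical conditional variation distance between the distributions of
    X_i given X_j = +1 and X_j = -1, conditioned on X_S; here taken as the
    maximum over observed assignments of X_S (one plausible variant)."""
    # x_S assignment -> x_j value -> [count of X_i = -1, count of X_i = +1]
    groups = defaultdict(lambda: {1: [0, 0], -1: [0, 0]})
    for x in samples:
        key = tuple(x[s] for s in S)
        groups[key][x[j]][(x[i] + 1) // 2] += 1
    dist = 0.0
    for counts in groups.values():
        n_plus, n_minus = sum(counts[1]), sum(counts[-1])
        if n_plus == 0 or n_minus == 0:
            continue  # cannot compare the two conditionals in this group
        # For binary X_i, total variation distance reduces to |p - q|.
        p = counts[1][1] / n_plus
        q = counts[-1][1] / n_minus
        dist = max(dist, abs(p - q))
    return dist

def estimate_edges(samples, eta, threshold):
    """Declare (i, j) an edge iff no conditioning set S with |S| <= eta
    makes the empirical conditional variation distance small."""
    p = len(samples[0])
    edges = set()
    for i, j in combinations(range(p), 2):
        others = [k for k in range(p) if k not in (i, j)]
        d = min(
            empirical_cond_tv(samples, i, j, S)
            for size in range(eta + 1)
            for S in combinations(others, size)
        )
        if d > threshold:
            edges.add((i, j))
    return edges
```

On samples from a chain \(X_0 - X_1 - X_2\), conditioning on \(X_1\) drives the distance for the non-edge \((0, 2)\) toward zero, while the true edges survive every small conditioning set; this is the local-separation idea behind the tractability criterion.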
Recommendations
- High-dimensional Gaussian graphical model selection: walk summability and local separation criterion
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- High-dimensional Ising model selection with Bayesian information criteria
- Sparse estimation in Ising model via penalized Monte Carlo methods
- Efficiently learning Ising models on arbitrary graphs (extended abstract)
Cites work
- scientific article; zbMATH DE number 3904630
- scientific article; zbMATH DE number 1134987
- scientific article; zbMATH DE number 6253918
- scientific article; zbMATH DE number 964896
- A Large-Deviation Analysis of the Maximum-Likelihood Learning of Markov Tree Structures
- Approximating discrete probability distributions with dependence trees
- Biological Sequence Analysis
- Collective dynamics of `small-world' networks
- Complex graphs and networks
- Complex social networks.
- Diameter and treewidth in minor-closed graph families
- Elements of Information Theory
- Estimating high-dimensional directed acyclic graphs with the PC-algorithm
- Forest density estimation
- Graphical models, exponential families, and variational inference
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- High-dimensional graphs and variable selection with the Lasso
- High-dimensional structure estimation in Ising models: local separation criterion
- Ising models on power-law random graphs
- Learning Bayesian networks from data: An information-theory based approach
- Learning Gaussian Tree Models: Analysis of Error Exponents and Extremal Structures
- Learning Markov networks: Maximum bounded tree-width graphs
- Learning factor graphs in polynomial time and sample complexity
- Learning latent tree graphical models
- Markov Chains
- Markov chains and mixing times. With a chapter on "Coupling from the past" by James G. Propp and David B. Wilson.
- Mengerian theorems for paths of bounded length
- On the girth of random Cayley graphs
- Random graph models of social networks
- Reconstruction of Markov Random Fields from Samples: Some Observations and Algorithms
- Rejoinder: Latent variable graphical model selection via convex optimization
- Short cycles in random regular graphs
- Statistical mechanics of complex networks
- The Complexity of Distinguishing Markov Random Fields
Cited in (28)
- Sparse model selection in the highly under-sampled regime
- Objective Bayesian edge screening and structure selection for Ising networks
- Computational implications of reducing data to sufficient statistics
- High-dimensional Gaussian graphical model selection: walk summability and local separation criterion
- Universality of the mean-field for the Potts model
- High-dimensional Ising model selection with Bayesian information criteria
- Learning a tree-structured Ising model in order to make predictions
- Updating of the Gaussian graphical model through targeted penalized estimation
- Causal Structural Learning via Local Graphs
- Joint estimation of parameters in Ising model
- Learning loopy graphical models with latent variables: efficient methods and guarantees
- An expectation maximization algorithm for high-dimensional model selection for the Ising model with misclassified states*
- Estimating heterogeneous graphical models for discrete data with an application to roll call voting
- Efficiently learning Ising models on arbitrary graphs (extended abstract)
- AMP chain graphs: minimal separators and structure learning algorithms
- Graphical Models and Message-Passing Algorithms: Some Introductory Lectures
- Property testing in high-dimensional Ising models
- A decomposition-based algorithm for learning the structure of multivariate regression chain graphs
- High-dimensional structure estimation in Ising models: local separation criterion
- Tensor recovery in high-dimensional Ising models
- Sparse estimation in Ising model via penalized Monte Carlo methods
- Bayesian model selection for high-dimensional Ising models, with applications to educational data
- Empirical comparison study of approximate methods for structure selection in binary graphical models
- Nonconcave penalized composite conditional likelihood estimation of sparse Ising models
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- Structure estimation for discrete graphical models: generalized covariance matrices and their inverses
- Inequalities on partial correlations in Gaussian graphical models containing star shapes
- A global approach for learning sparse Ising models