Learning a tree-structured Ising model in order to make predictions
DOI: 10.1214/19-AOS1808 · zbMATH Open: 1469.62334 · arXiv: 1604.06749 · MaRDI QID: Q2196190 · FDO: Q2196190
Authors: Guy Bresler, Mina Karzand
Publication date: 28 August 2020
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1604.06749
Keywords: model selection; prediction; high-dimensional statistics; Ising model; Markov random fields; tree model; Chow-Liu tree; maximum likelihood tree
MSC classification: Asymptotic properties of parametric estimators (62F12); Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10); Learning and adaptive systems in artificial intelligence (68T05); Estimation in multivariate analysis (62H12); Inference from stochastic processes and prediction (62M20); \(L^p\)-limit theorems (60F25)
Cites Work
- Graphical models, exponential families, and variational inference
- High-dimensional graphs and variable selection with the Lasso
- Introduction to algorithms.
- Title not available
- Probability Inequalities for Sums of Bounded Random Variables
- Gibbs measures and phase transitions.
- Approximating discrete probability distributions with dependence trees
- Can local particle filters beat the curse of dimensionality?
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- Probabilistic graphical models.
- High-dimensional structure estimation in Ising models: local separation criterion
- Structure estimation for discrete graphical models: generalized covariance matrices and their inverses
- Learning with mixtures of trees.
- Forest density estimation
- Learning low-level vision
- Title not available
- On the Approximability of Numerical Taxonomy (Fitting Distances by Tree Metrics)
- Introduction to nonparametric estimation
- Learning Markov networks: Maximum bounded tree-width graphs
- Image denoising using scale mixtures of Gaussians in the wavelet domain
- Learning loopy graphical models with latent variables: efficient methods and guarantees
- Reconstruction of Markov Random Fields from Samples: Some Observations and Algorithms
- Phylogenies without branch bounds: contracting the short, pruning the deep
- A few logs suffice to build (almost) all trees. II
- Title not available
- Estimating the ``wrong'' graphical model: benefits in the computation-limited setting
- A Large-Deviation Analysis of the Maximum-Likelihood Learning of Markov Tree Structures
- Title not available
- Learning factor graphs in polynomial time and sample complexity
- Random cascades on wavelet trees and their use in analyzing and modeling natural images
- Efficiently learning Ising models on arbitrary graphs (extended abstract)
- Information-Theoretic Limits of Selecting Binary Graphical Models in High Dimensions
- Learning loosely connected Markov random fields
Cited In (9)
- Generative modeling via tree tensor network states
- Bayesian model selection for high-dimensional Ising models, with applications to educational data
- Tensor recovery in high-dimensional Ising models
- Title not available
- Stein's method for stationary distributions of Markov chains and application to Ising models
- The minimax learning rates of normal and Ising undirected graphical models
- Identifiability in robust estimation of tree structured models
- Learning polynomial transformations via generalized tensor decompositions
- Near-Optimal Learning of Tree-Structured Distributions by Chow and Liu