Bayesian learning of weakly structural Markov graph laws using sequential Monte Carlo methods
DOI: 10.1214/19-EJS1585
zbMATH Open: 1431.62339
arXiv: 1805.12571
OpenAlex: W2970068706
MaRDI QID: Q2323943
Authors: Jimmy Olsson, Tatjana Pavlenko, Felix L. Rios
Publication date: 13 September 2019
Published in: Electronic Journal of Statistics
Abstract: We present a sequential sampling methodology for weakly structural Markov laws, arising naturally in a Bayesian structure learning context for decomposable graphical models. As a key component of our suggested approach, we show that the problem of graph estimation, which in general lacks a natural sequential interpretation, can be recast into a sequential setting by proposing a recursive Feynman-Kac model that generates a flow of junction tree distributions over a space of increasing dimensions. We focus on particle MCMC methods to provide samples on this space, in particular on particle Gibbs (PG), as it allows for generating MCMC chains with global moves on an underlying space of decomposable graphs. To further improve the PG mixing properties, we incorporate a systematic refreshment step implemented through direct sampling from a backward kernel. The theoretical properties of the algorithm are investigated, showing that the proposed refreshment step improves the performance in terms of asymptotic variance of the estimated distribution. The suggested sampling methodology is illustrated through a collection of numerical examples demonstrating high accuracy in Bayesian graph structure learning in both discrete and continuous graphical models.
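The core idea in the abstract, recasting a static estimation problem as a Feynman-Kac flow over spaces of increasing dimension and sampling it with SMC, can be illustrated with a minimal, generic sketch. This is not the authors' junction-tree construction; the `extend` and `log_potential` callbacks, the binary-vector toy state space, and the plain multinomial resampling are all simplifying assumptions made here for illustration.

```python
import math
import random


def smc_flow(n_particles, n_steps, extend, log_potential, rng=random):
    """Generic SMC sampler targeting a Feynman-Kac flow.

    At each step t, every partial state is extended by one component
    (propagation), reweighted by a potential, and the particle cloud
    is resampled. The state space grows in dimension with t, mirroring
    the "flow of distributions over spaces of increasing dimension"
    described in the abstract.
    """
    particles = [[] for _ in range(n_particles)]
    for t in range(n_steps):
        # Propagate: extend each partial state by one new component.
        particles = [extend(p, t) for p in particles]
        # Weight by the (log) Feynman-Kac potential, normalized stably.
        logw = [log_potential(p, t) for p in particles]
        m = max(logw)
        w = [math.exp(lw - m) for lw in logw]
        total = sum(w)
        probs = [x / total for x in w]
        # Multinomial resampling of the weighted cloud.
        particles = [list(rng.choices(particles, probs)[0])
                     for _ in range(n_particles)]
    return particles


if __name__ == "__main__":
    random.seed(0)
    # Toy example: states are growing binary vectors; the potential
    # rewards vectors with many ones, so resampling concentrates there.
    cloud = smc_flow(
        n_particles=100,
        n_steps=6,
        extend=lambda p, t: p + [random.randint(0, 1)],
        log_potential=lambda p, t: float(sum(p)),
    )
    print(len(cloud), len(cloud[0]))
```

In a particle Gibbs variant, one trajectory of the previous MCMC iterate would be held fixed (conditioned on) through the propagation and resampling steps; the refreshment step discussed in the paper replaces that frozen path via a backward kernel, which is what improves mixing.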
Full work available at URL: https://arxiv.org/abs/1805.12571
Recommendations
- Improving structure MCMC for Bayesian networks through Markov blanket resampling
- Structural Markov graph laws for Bayesian model uncertainty
- A Gibbs sampler for learning DAGs
- Structure learning in Bayesian networks of a moderate size by efficient sampling
- Learning undirected graphical models using persistent sequential Monte Carlo
Mathematics Subject Classification
- Monte Carlo methods (65C05)
- Graphical methods in statistics (62A09)
- Stochastic approximation (62L20)
Cites Work
- Experiments in stochastic computation for high-dimensional graphical models
- Sequential Monte Carlo Samplers
- A conjugate prior for discrete hierarchical log-linear models
- Hyper Markov laws in the statistical analysis of decomposable graphical models
- Inference in hidden Markov models.
- A fast procedure for model search in multidimensional contingency tables
- Model Selection and Accounting for Model Uncertainty in Graphical Models Using Occam's Window
- Particle Markov Chain Monte Carlo Methods
- Markov chain Monte Carlo model determination for hierarchical and graphical log-linear models
- Gaussian Markov distributions over finite graphs
- Decomposable graphical Gaussian model determination
- On particle Gibbs sampling
- Comparison of asymptotic variances of inhomogeneous Markov chains with application to Markov chain Monte Carlo methods
- Efficient local updates for undirected graphical models
- Bayesian clustering in decomposable graphs
- Two methods for the generation of chordal graphs
- Sampling decomposable graphs using a Markov chain on junction trees
- Sequential sampling of junction trees for decomposable graphs
- A structural Markov property for decomposable graph laws that allows control of clique intersections
Cited In (3)