Efficient MCMC for Gibbs random fields using pre-computation
From MaRDI portal
Abstract: Bayesian inference for Gibbs random fields (GRFs) is often referred to as a doubly intractable problem, since the likelihood function itself is intractable. The posterior distribution of such models is typically explored with a sophisticated Markov chain Monte Carlo (MCMC) method, the exchange algorithm (Murray et al., 2006), which requires a simulation from the likelihood function at each iteration. The purpose of this paper is to consider an approach that dramatically reduces this computational overhead. To this end we introduce a novel class of algorithms which use realizations of the GRF model, simulated offline, at locations specified by a grid that spans the parameter space. This strategy dramatically speeds up posterior inference, as illustrated on several examples. However, using the pre-computed graphs introduces noise into the MCMC algorithm, which is no longer exact. We study the theoretical behaviour of the resulting approximate MCMC algorithm and derive convergence bounds using a recent theoretical development on approximate MCMC methods.
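The idea sketched in the abstract can be illustrated on a toy model. The sketch below is not the paper's implementation: it uses a deliberately simple exponential-family "field" (independent Bernoulli sites, so exact simulation is trivial) purely to show the mechanics. For such models p(y|θ) ∝ exp(θ·s(y)), and the exchange-algorithm acceptance ratio needs only the sufficient statistic of an auxiliary draw y' ~ p(·|θ'); the pre-computation strategy therefore stores sufficient statistics of realizations simulated offline at grid points spanning the parameter space, and the online chain reuses the draws stored at the grid node nearest the proposed θ'. All names (`simulate_suff_stat`, `run_chain`, grid sizes) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "GRF": n independent Bernoulli sites with natural parameter theta,
#   p(y | theta) ∝ exp(theta * s(y)),   s(y) = sum(y).
# We pretend the normalizing constant is intractable and use the
# exchange-algorithm acceptance ratio, in which those constants cancel.
n_sites = 50

def simulate_suff_stat(theta, size, rng):
    """Draw realizations from the model; return their sufficient statistics."""
    p = 1.0 / (1.0 + np.exp(-theta))
    return rng.binomial(n_sites, p, size=size).astype(float)

# --- Offline pre-computation: auxiliary draws on a grid over the parameter space.
grid = np.linspace(-3.0, 3.0, 121)
draws_per_node = 200
pre = {i: simulate_suff_stat(t, draws_per_node, rng) for i, t in enumerate(grid)}

# Observed data generated at a known "true" parameter.
theta_true = 0.8
s_obs = simulate_suff_stat(theta_true, 1, rng)[0]

# --- Online: noisy exchange-type chain reusing the pre-computed draws.
def run_chain(n_iter, step=0.5, rng=rng):
    theta = 0.0
    chain = np.empty(n_iter)
    for it in range(n_iter):
        prop = theta + step * rng.normal()
        if -3.0 <= prop <= 3.0:       # flat prior on the gridded range
            # Nearest grid node stands in for exact simulation at `prop`;
            # this substitution is what makes the chain approximate ("noisy").
            i = int(np.argmin(np.abs(grid - prop)))
            s_aux = rng.choice(pre[i])
            # Exchange ratio: normalizing constants cancel for both terms.
            log_alpha = (prop - theta) * s_obs + (theta - prop) * s_aux
            if np.log(rng.uniform()) < log_alpha:
                theta = prop
        chain[it] = theta
    return chain

chain = run_chain(5000)
print(f"approx. posterior mean: {chain[1000:].mean():.2f} (true theta = {theta_true})")
```

Because only pre-computed statistics are consulted online, no model simulation happens inside the MCMC loop; the price, as the abstract notes, is that the resulting kernel is approximate, with error controlled by the grid resolution and the number of stored draws per node.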
Recommendations
- Model comparison for Gibbs random fields using noisy reversible jump Markov chain Monte Carlo
- On Russian roulette estimates for Bayesian inference with doubly-intractable likelihoods
- Markov-chain Monte Carlo: some practical implications of theoretical results
- ABC likelihood-free methods for model choice in Gibbs random fields
- Generalised Gibbs sampler and multigrid Monte Carlo for Bayesian computation
Cites work
- scientific article; zbMATH DE number 3513115
- scientific article; zbMATH DE number 840151
- scientific article; zbMATH DE number 6781368
- A Bayesian Reassessment of Nearest-Neighbor Classification
- A Stochastic Approximation Method
- A note on Metropolis-Hastings kernels for general state spaces
- Approximate Bayesian computational methods
- Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations (with discussion)
- Equation of state calculations by fast computing machines
- Exact sampling with coupled Markov chains and applications to statistical mechanics
- Exploiting multi-core architectures for reduced-variance estimation with intractable likelihoods
- Gibbs measures and phase transitions.
- Instability, sensitivity, and degeneracy of discrete exponential families
- Mixing time of exponential random graphs
- Noisy Hamiltonian Monte Carlo for Doubly Intractable Distributions
- Noisy Monte Carlo: convergence of Markov chains with approximate transition kernels
- Optimum Monte-Carlo sampling using Markov chains
- Pre-processing for approximate Bayesian computation in image analysis
- Scalable Bayesian inference for the inverse temperature of a hidden Potts model
- Sequential Monte Carlo Samplers
- Simulating normalizing constants: From importance sampling to bridge sampling to path sampling
- Speeding up MCMC by Delayed Acceptance and Data Subsampling
- Stability of noisy Metropolis-Hastings
- The pseudo-marginal approach for efficient Monte Carlo computations
Cited in (7)
- Scalable Bayesian inference for the inverse temperature of a hidden Potts model
- Approximate Bayesian inference for hierarchical Gaussian Markov random field models
- Model comparison for Gibbs random fields using noisy reversible jump Markov chain Monte Carlo
- Bayesian indirect inference for models with intractable normalizing functions
- Preconditioning Markov Chain Monte Carlo Simulations Using Coarse-Scale Models
- On Russian roulette estimates for Bayesian inference with doubly-intractable likelihoods
- Finding our way in the dark: approximate MCMC for approximate Bayesian methods