Scalable inference for Markov processes with intractable likelihoods
Publication:5963547
Abstract: Bayesian inference for Markov processes has become increasingly relevant in recent years. Problems of this type often have intractable likelihoods, and prior knowledge about model rate parameters is often poor. Markov chain Monte Carlo (MCMC) techniques can deliver exact inference in such models, but in practice they can suffer performance issues, including long burn-in periods and poor mixing. Approximate Bayesian computation (ABC) techniques, on the other hand, allow rapid exploration of a large parameter space but yield only approximate posterior distributions. Here we consider the combined use of ABC and MCMC techniques on parallel hardware, improving computational efficiency while retaining exact inference.
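As background to the abstract, the ABC side of the scheme can be illustrated with a minimal rejection sampler for a simple Markov process. The model (a pure-birth process with per-individual rate theta), the Uniform(0, 5) prior, the summary statistic (final population size), and the tolerance are all illustrative assumptions for this sketch, not the authors' actual setup.

```python
import random

def simulate_birth_process(theta, t_max=1.0, seed=None):
    """Gillespie-style simulation of a pure-birth Markov process with
    per-individual birth rate theta, started from one individual.
    Returns the population size at time t_max. Illustrative model only."""
    rng = random.Random(seed)
    n, t = 1, 0.0
    while True:
        t += rng.expovariate(theta * n)  # time to next birth event
        if t > t_max:
            return n
        n += 1

def abc_rejection(observed, n_samples=2000, eps=2, seed=0):
    """ABC rejection sampling: draw theta from a Uniform(0, 5) prior,
    simulate the process, and accept theta when the simulated summary
    (final size) lies within eps of the observed summary."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_samples):
        theta = rng.uniform(0.0, 5.0)
        x = simulate_birth_process(theta, seed=rng.randrange(10**9))
        if abs(x - observed) <= eps:
            accepted.append(theta)
    return accepted

# Accepted draws form an approximate posterior sample for theta.
posterior = abc_rejection(observed=7)
```

Because the likelihood is never evaluated, such a sampler yields only an approximate posterior; the paper's contribution is to combine this kind of fast ABC exploration with exact (e.g. pseudo-marginal MCMC) inference.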
Recommendations
- Markov chain Monte Carlo inference for Markov jump processes via the linear noise approximation
- Parameter estimation for hidden Markov models with intractable likelihoods
- An adaptive sequential Monte Carlo method for approximate Bayesian computation
- Bayesian inference in the presence of intractable normalizing functions
- Likelihood-free MCMC
Cites work
- A comparative review of dimension reduction methods in approximate Bayesian computation
- Adaptive approximate Bayesian computation
- Constructing summary statistics for approximate Bayesian computation: semi-automatic approximate Bayesian computation. With discussion and authors' reply
- Delayed acceptance particle MCMC for exact inference in stochastic kinetic models
- Efficient implementation of Markov chain Monte Carlo when using an unbiased likelihood estimator
- Estimation of parameters for macroparasite population evolution using approximate Bayesian computation
- On some properties of Markov chain Monte Carlo simulation methods based on the particle filter
- On the efficiency of pseudo-marginal random walk Metropolis algorithms
- Optimal scaling for various Metropolis-Hastings algorithms
- Particle Markov Chain Monte Carlo Methods
- Sequential Monte Carlo Methods in Practice
- Sequential Monte Carlo Samplers
- Sequential Monte Carlo without likelihoods
- Stochastic modeling of aphid population growth with nonlinear, power-law dynamics
- Stochastic modelling for systems biology
- The pseudo-marginal approach for efficient Monte Carlo computations
- Weak convergence and optimal scaling of random walk Metropolis algorithms
Cited in (15)
- Augmented pseudo-marginal Metropolis-Hastings for partially observed diffusion processes
- Accelerating inference for stochastic kinetic models
- Parallel inference for big data with the group Bayesian method
- A partitioned quasi-likelihood for distributed statistical inference
- Inference on high-dimensional implicit dynamic models using a guided intermediate resampling filter
- Bayesian inference in the presence of intractable normalizing functions
- Particle MCMC algorithms and architectures for accelerating inference in state-space models
- Direct likelihood-based inference for discretely observed stochastic compartmental models of infectious disease
- Likelihood free inference for Markov processes: a comparison
- Patterns of scalable Bayesian inference
- Markov chain Monte Carlo inference for Markov jump processes via the linear noise approximation
- Bayesian computation: a summary of the current state, and samples backwards and forwards
- Unbiased Bayesian inference for population Markov jump processes via random truncations
- Birth/birth-death processes and their computable transition probabilities with biological applications
- Introduction to ``Scalable inference for Markov processes with intractable likelihoods'' by J. Owen, D. Wilkinson, C. Gillespie