Accelerating Metropolis-within-Gibbs sampler with localized computations of differential equations
From MaRDI portal
Abstract: Inverse problems are ubiquitous in science and engineering, and Bayesian methodologies are often used to infer the underlying parameters. For high-dimensional spatio-temporal models, classical Markov chain Monte Carlo (MCMC) methods are often slow to converge, and it is necessary to apply Metropolis-within-Gibbs (MwG) sampling on parameter blocks. However, the computation cost of each MwG iteration is typically O(d), where d is the model dimension. This can be too expensive in practice. This paper introduces a new reduced computation method to bring down the computation cost to O(1) for the inverse initial value problem of a stochastic differential equation (SDE) with local interactions. The key observation is that each MwG proposal differs from the original iterate at only one parameter block, and this difference propagates only within a local domain in the SDE computations. Therefore we can approximate the global SDE computation with a surrogate updated only within the local domain for reduced computation cost. Both theoretically and numerically, we show that the approximation errors can be controlled by the local domain size. We discuss how to implement the local computation scheme using the Euler–Maruyama and fourth-order Runge–Kutta methods. We numerically demonstrate the performance of the proposed method with the Lorenz 96 model and a linear stochastic flow model.
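The key observation above can be illustrated with a minimal sketch (not the authors' code): for the Lorenz 96 model, a proposal that changes one initial-value coordinate only influences coordinates reachable through the model's nearest-neighbor stencil, so the trajectory can be re-solved inside a growing (or truncated) window instead of globally. The sketch below assumes the deterministic Lorenz 96 drift with an explicit Euler step (the noiseless limit of Euler–Maruyama); all function names are illustrative, not from the paper.

```python
import numpy as np

def l96_drift(x, F=8.0):
    # Lorenz 96: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F (periodic indices)
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def euler_trajectory(x0, n_steps, dt=0.01):
    # Full (global) explicit Euler solve: cost O(d) per step.
    xs = [np.asarray(x0, dtype=float)]
    for _ in range(n_steps):
        xs.append(xs[-1] + dt * l96_drift(xs[-1]))
    return np.array(xs)

def local_update(traj, x0_new, j, n_steps, dt=0.01, radius=None):
    """Reuse a stored trajectory `traj` (from the old initial value) and
    re-solve only inside a window around the perturbed coordinate j.
    The drift at i reads x_{i-2}, x_{i-1}, x_i, x_{i+1}, so the affected
    window grows by 1 to the left and 2 to the right per step; capping it
    at `radius` gives the paper's controlled approximation."""
    d = traj.shape[1]
    new = traj.copy()
    new[0] = x0_new
    lo, hi = j, j  # current affected window (inclusive endpoints, mod d)
    for k in range(n_steps):
        lo -= 1; hi += 2                       # propagate through the stencil
        if radius is not None:                 # truncate: local approximation
            lo = max(lo, j - radius); hi = min(hi, j + radius)
        idx = np.arange(lo, hi + 1) % d
        # For simplicity the drift is evaluated globally here; a real
        # implementation evaluates it only on idx and its stencil, O(window).
        full = new[k] + dt * l96_drift(new[k])
        new[k + 1] = traj[k + 1]
        new[k + 1][idx] = full[idx]
    return new
```

With `radius=None` the window is exact and `local_update` reproduces the full re-solve; with a finite `radius` the discrepancy stays small because the perturbation decays rapidly with distance, which is the error-versus-domain-size trade-off the abstract describes.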
Recommendations
- Sequential Monte Carlo methods for high-dimensional inverse problems: a case study for the Navier-Stokes equations
- An adaptive reduced basis ANOVA method for high-dimensional Bayesian inverse problems
- Localization for MCMC: sampling high-dimensional posterior distributions with local structure
- A data-driven and model-based accelerated Hamiltonian Monte Carlo method for Bayesian elliptic inverse problems
- Complexity analysis of accelerated MCMC methods for Bayesian inversion
Cites work
- scientific article; zbMATH DE number 5871504
- scientific article; zbMATH DE number 3565627
- scientific article; zbMATH DE number 840151
- Bayesian inverse problems for functions and applications to fluid mechanics
- Convergence properties of pseudo-marginal Markov chain Monte Carlo algorithms
- Data assimilation: methods, algorithms, and applications
- Equation of state calculations by fast computing machines
- Filter accuracy for the Lorenz 96 model: fixed versus adaptive observation operators
- Filtering complex turbulent systems
- Inverse problems. Basics, theory and applications in geophysics
- Inverse problems: a Bayesian perspective
- Localization for MCMC: sampling high-dimensional posterior distributions with local structure
- Monte Carlo sampling methods using Markov chains and their applications
- Optimal Scaling of Discrete Approximations to Langevin Diffusions
- Performance analysis of local ensemble Kalman filter
- Sampling-Based Approaches to Calculating Marginal Densities
- Spatial localization for nonlinear dynamical stochastic models for excitable media
- Statistical and computational inverse problems
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
- The pseudo-marginal approach for efficient Monte Carlo computations
Cited in (4)
- Convergence acceleration of ensemble Kalman inversion in nonlinear settings
- MALA-within-Gibbs samplers for high-dimensional distributions with sparse conditional structure
- Sequential Monte Carlo methods for high-dimensional inverse problems: a case study for the Navier-Stokes equations
- A CVAE-within-Gibbs sampler for Bayesian linear inverse problems with hyperparameters
This page was built for publication: Accelerating Metropolis-within-Gibbs sampler with localized computations of differential equations
MaRDI item Q2195847