Emulation of higher-order tensors in manifold Monte Carlo methods for Bayesian inverse problems
From MaRDI portal
Abstract: The Bayesian approach to inverse problems relies predominantly on Markov chain Monte Carlo methods for posterior inference. The nonlinear concentration of posterior measure typical of many such inverse problems presents severe challenges to existing simulation-based inference methods. Motivated by these challenges, the exploitation of local geometric information, in the form of covariant gradients, metric tensors, Levi-Civita connections, and local geodesic flows, has been introduced to explore the configuration space of the posterior measure more effectively. However, obtaining such geometric quantities usually requires extensive computational effort, which, despite their effectiveness, limits the applicability of these geometrically based Monte Carlo methods. In this paper we explore one way to address this issue: constructing an emulator of the model from which all geometric objects can be obtained at much lower computational cost. The main idea is to approximate the geometric quantities with a Gaussian process emulator conditioned on a carefully chosen design set of configuration points, which also determines the quality of the emulator. To this end we propose the use of statistical experiment design methods to refine a potentially arbitrarily initialized design online, without destroying the convergence of the resulting Markov chain to the desired invariant measure. The practical examples considered in this paper demonstrate significant reductions in computational load, suggesting that this is a promising avenue for further development.
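The core idea of the abstract, conditioning a Gaussian process on a design set and then reading off geometric quantities such as gradients in closed form, can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a squared-exponential kernel with a fixed length-scale, and all function names are illustrative. The point is that once the GP weights are computed, the emulated log-posterior and its gradient both cost only a kernel evaluation against the design set, rather than a run of the forward model.

```python
import numpy as np

def rbf(x, X, ell):
    # squared-exponential kernel between a single point x and the design rows X
    d = X - x
    return np.exp(-np.sum(d * d, axis=1) / (2.0 * ell**2))

def fit_emulator(X, y, ell=1.0, jitter=1e-8):
    # condition the GP on the design set (X, y): solve for the weight vector alpha
    K = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1) / (2.0 * ell**2))
    return np.linalg.solve(K + jitter * np.eye(len(X)), y)

def emulate(x, X, alpha, ell=1.0):
    # emulated log-posterior mean and its gradient at x, both in closed form;
    # the gradient follows by differentiating the kernel, no forward-model call needed
    k = rbf(x, X, ell)
    mean = k @ alpha
    grad = ((X - x) / ell**2 * k[:, None]).T @ alpha
    return mean, grad
```

In a geometric MCMC scheme, `emulate` would supply the drift (and, with second kernel derivatives, the metric tensor) at each proposal, while the design set `X` is refined online as the abstract describes.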
Recommendations
- Geometric MCMC for infinite-dimensional inverse problems
- Solving large-scale PDE-constrained Bayesian inverse problems with Riemann manifold Hamiltonian Monte Carlo
- Adaptive dimension reduction to accelerate infinite-dimensional geometric Markov chain Monte Carlo
- Adaptive construction of surrogates for the Bayesian solution of inverse problems
- Bayesian inverse problems with Monte Carlo forward models
Cites work
- scientific article; zbMATH DE number 3751955
- scientific article; zbMATH DE number 597911
- scientific article; zbMATH DE number 1077338
- scientific article; zbMATH DE number 2061729
- scientific article; zbMATH DE number 1522714
- scientific article; zbMATH DE number 1529823
- scientific article; zbMATH DE number 1560711
- scientific article; zbMATH DE number 272681
- A Comparison of Three Methods for Selecting Values of Input Variables in the Analysis of Output from a Computer Code
- Adaptive Hessian-Based Nonstationary Gaussian Process Response Surface Method for Probability Density Approximation with Application to Bayesian Solution of Large-Scale Inverse Problems
- Adaptive Markov Chain Monte Carlo through Regeneration
- Bayesian Analysis of the Scatterometer Wind Retrieval Inverse Problem: Some New Approaches
- Bayesian calibration of computer models. (With discussion)
- Bayesian emulation of complex multi-output and dynamic computer models
- Bayesian solution uncertainty quantification for differential equations
- Design and analysis of computer experiments. With comments and a rejoinder by the authors
- Exploratory designs for computational experiments
- Gaussian processes for machine learning.
- General Irreducible Markov Chains and Non-Negative Operators
- Geometric Numerical Integration
- Hierarchical interpolative factorization for elliptic operators: differential equations
- Hierarchical interpolative factorization for elliptic operators: integral equations
- MCMC using Hamiltonian dynamics
- Multi-output local Gaussian process regression: applications to uncertainty quantification
- NUTS
- Optimal monitoring network designs
- Probabilistic Sensitivity Analysis of Complex Models: A Bayesian Approach
- Regeneration in Markov Chain Samplers
- Riemann manifold Langevin and Hamiltonian Monte Carlo methods. With discussion and authors' reply
- Sequential Design with Mutual Information for Computer Experiments (MICE): Emulation of a Tsunami Model
- Simulating Hamiltonian Dynamics
- Solution of inverse problems with limited forward solver evaluations: a Bayesian perspective
- Solving large-scale PDE-constrained Bayesian inverse problems with Riemann manifold Hamiltonian Monte Carlo
- The design and analysis of computer experiments.
- The geometric foundations of Hamiltonian Monte Carlo
- The no-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo
- Uncertainty Quantification and Weak Approximation of an Elliptic Inverse Problem
- Using emulators to estimate uncertainty in complex models
Cited in (21)
- Scaling Up Bayesian Uncertainty Quantification for Inverse Problems Using Deep Neural Networks
- A data-driven and model-based accelerated Hamiltonian Monte Carlo method for Bayesian elliptic inverse problems
- On the accept-reject mechanism for Metropolis-Hastings algorithms
- Neural network gradient Hamiltonian Monte Carlo
- Probabilistic integration: a role in statistical computation?
- Multi-stage splitting integrators for sampling with modified Hamiltonian Monte Carlo methods
- Accelerating Monte Carlo estimation with derivatives of high-level finite element models
- Stein variational gradient descent with local approximations
- Special issue: Big data and predictive computational modeling
- The Bayesian formulation of EIT: analysis and algorithms
- Geometric MCMC for infinite-dimensional inverse problems
- Parameter inference based on Gaussian processes informed by nonlinear partial differential equations
- Emulation-accelerated Hamiltonian Monte Carlo algorithms for parameter estimation and uncertainty quantification in differential equation models
- Geometric adaptive Monte Carlo in random environment
- Calibrate, emulate, sample
- Applying kriging proxies for Markov chain Monte Carlo in reservoir simulation
- An Acceleration Strategy for Randomize-Then-Optimize Sampling Via Deep Neural Networks
- Adaptive dimension reduction to accelerate infinite-dimensional geometric Markov chain Monte Carlo
- Variational inference for nonlinear inverse problems via neural net kernels: comparison to Bayesian neural networks, application to topology optimization
- Ensemble inference methods for models with noisy and expensive likelihoods
- A Gaussian Process Emulator Based Approach for Bayesian Calibration of a Functional Input