Bayesian inference with optimal maps
From MaRDI portal
Abstract: We present a new approach to Bayesian inference that entirely avoids Markov chain simulation, by constructing a map that pushes forward the prior measure to the posterior measure. Existence and uniqueness of a suitable measure-preserving map is established by formulating the problem in the context of optimal transport theory. We discuss various means of explicitly parameterizing the map and computing it efficiently through solution of an optimization problem, exploiting gradient information from the forward model when possible. The resulting algorithm overcomes many of the computational bottlenecks associated with Markov chain Monte Carlo. Advantages of a map-based representation of the posterior include analytical expressions for posterior moments and the ability to generate arbitrary numbers of independent posterior samples without additional likelihood evaluations or forward solves. The optimization approach also provides clear convergence criteria for posterior approximation and facilitates model selection through automatic evaluation of the marginal likelihood. We demonstrate the accuracy and efficiency of the approach on nonlinear inverse problems of varying dimension, involving the inference of parameters appearing in ordinary and partial differential equations.
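The core idea of the abstract — finding a monotone map that pushes the prior forward to the posterior by minimizing an optimization objective rather than running a Markov chain — can be illustrated with a minimal one-dimensional sketch. This is not the paper's actual parameterization (the paper uses multivariate polynomial maps and exploits forward-model gradients); here the map is linear, the problem is a Gaussian toy example with an assumed observation `y_obs`, and the objective is a Monte Carlo estimate of the KL divergence between the pushed-forward prior and the posterior, up to an additive constant:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy inverse problem: observe y = x + noise, noise ~ N(0, sigma^2),
# with a standard normal prior on x. Posterior is Gaussian and known
# in closed form, so we can check the computed map against it.
sigma = 0.5
y_obs = 1.0

def log_post(x):
    # Unnormalized log-posterior: log prior + log likelihood.
    return -0.5 * x**2 - 0.5 * ((y_obs - x) / sigma) ** 2

# Parameterize a monotone linear map T(x) = a + exp(c) * x; the exp
# keeps the derivative positive, so T is increasing.
x_ref = rng.standard_normal(5000)  # samples from the prior (reference measure)

def objective(theta):
    # Sample-average estimate of KL(T# prior || posterior) up to a constant:
    #   -E_prior[log posterior(T(x))] - E[log T'(x)].
    a, c = theta
    Tx = a + np.exp(c) * x_ref
    return -(log_post(Tx).mean() + c)

res = minimize(objective, x0=[0.0, 0.0])
a_hat, s_hat = res.x[0], np.exp(res.x[1])

# Closed-form Gaussian posterior for comparison.
post_var = 1.0 / (1.0 + 1.0 / sigma**2)
post_mean = post_var * y_obs / sigma**2
print(a_hat, s_hat)  # map offset ≈ posterior mean, map scale ≈ posterior std
```

Once the map is found, posterior samples come for free: apply `T` to fresh prior samples, with no further likelihood evaluations — one of the advantages the abstract highlights.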
Recommendations
- Transport map accelerated Markov chain Monte Carlo
- A multiscale strategy for Bayesian inference using transport maps
- Adaptive construction of surrogates for the Bayesian solution of inverse problems
- Dimension-independent likelihood-informed MCMC
- A Randomized Maximum A Posteriori Method for Posterior Sampling of High Dimensional Nonlinear Bayesian Inverse Problems
Cites work
- scientific article; zbMATH DE number 578421
- scientific article; zbMATH DE number 2117879
- scientific article; zbMATH DE number 840151
- scientific article; zbMATH DE number 5043156
- A Bayesian approach to characterizing uncertainty in inverse problems using coarse and fine-scale information
- A dynamical systems framework for intermittent data assimilation
- A stochastic Newton MCMC method for large-scale statistical inverse problems with application to seismic inversion
- A stochastic collocation approach to Bayesian inference in inverse problems
- An adaptive multi-element generalized polynomial chaos method for stochastic differential equations
- An analysis of polynomial chaos approximations for modeling single-fluid-phase flow in porous medium systems
- Bayes Factors
- Contributions to the theory of convex bodies
- Dimensionality reduction and polynomial chaos acceleration of Bayesian inference in inverse problems
- Equation of state calculations by fast computing machines
- Existence and uniqueness of monotone measure-preserving maps
- From Knothe's Transport to Brenier's Map and a Continuation Method for Optimal Transport
- Gaussian processes for machine learning
- Implicit particle filters for data assimilation
- Inference from simulations and monitoring convergence
- Inverse Problem Theory and Methods for Model Parameter Estimation
- Inverse problems: a Bayesian perspective
- Jensen’s inequality
- Langevin-type models II: Self-targeting candidates for MCMC algorithms
- Learn from thy neighbor: parallel-chain and regional adaptive MCMC
- MCMC methods for functions: modifying old algorithms to make them faster
- MCMC using Hamiltonian dynamics
- Monte Carlo sampling methods using Markov chains and their applications
- Monte Carlo strategies in scientific computing
- Nonparametric estimation of diffusions: a differential equations approach
- Nonstationary inverse problems and state estimation
- Numerical methods for stochastic computations. A spectral method approach.
- On adaptive choice of shifts in rational Krylov subspace reduction of evolutionary problems
- Optimal Transport
- Physical Systems with Random Uncertainties: Chaos Representations with Arbitrary Probability Measure
- Polar factorization and monotone rearrangement of vector‐valued functions
- Preconditioning Markov Chain Monte Carlo Simulations Using Coarse-Scale Models
- Remarks on a Multivariate Transformation
- Riemann manifold Langevin and Hamiltonian Monte Carlo methods. With discussion and authors' reply
- Robust Stochastic Approximation Approach to Stochastic Programming
- Sampling the posterior: an approach to non-Gaussian data assimilation
- Sequential Monte Carlo Methods in Practice
- Spectral Methods for Uncertainty Quantification
- Stochastic spectral methods for efficient Bayesian solution of inverse problems
- The Regularity of Mappings with a Convex Potential
- The Wiener–Askey Polynomial Chaos for Stochastic Differential Equations
- The geometry of optimal transportation
- Uncertainty Quantification and Weak Approximation of an Elliptic Inverse Problem
- Using Bayesian statistics in the estimation of heat source in radiation
Cited in (92 documents)
- Algorithms for Kullback-Leibler approximation of probability measures in infinite dimensions
- Particle-based energetic variational inference
- Inference via low-dimensional couplings
- Probabilistic learning inference of boundary value problem with uncertainties based on Kullback-Leibler divergence under implicit constraints
- p-kernel Stein variational gradient descent for data assimilation and history matching
- Efficient Bayesian inference with latent Hamiltonian neural networks in no-U-turn sampling
- Learning physics-based models from data: perspectives from inverse problems and model reduction
- Entropical optimal transport, Schrödinger's system and algorithms
- Stein variational reduced basis Bayesian inversion
- Sampling-free linear Bayesian update of polynomial chaos representations
- Stability of Gibbs posteriors from the Wasserstein loss for Bayesian full waveform inversion
- Iterative construction of Gaussian process surrogate models for Bayesian inference
- Quantum Wasserstein isometries on the qubit state space
- MALA-within-Gibbs samplers for high-dimensional distributions with sparse conditional structure
- Latent simplex position model: high dimensional multi-view clustering with uncertainty quantification
- Scaling limits in computational Bayesian inversion
- scientific article; zbMATH DE number 7626758
- Variational inference for nonlinear inverse problems via neural net kernels: comparison to Bayesian neural networks, application to topology optimization
- Expectation propagation for nonlinear inverse problems – with an application to electrical impedance tomography
- Low-rank separated representation surrogates of high-dimensional stochastic functions: application in Bayesian inference
- Minimization for conditional simulation: relationship to optimal transport
- Bayesian inference via projections
- Bayesian inverse problems and Kalman filters
- Data assimilation: the Schrödinger perspective
- Multilevel Monte Carlo for smoothing via transport methods
- Combining push-forward measures and Bayes' rule to construct consistent solutions to stochastic inverse problems
- Diffusion map-based algorithm for gain function approximation in the feedback particle filter
- Interacting Langevin diffusions: gradient structure and ensemble Kalman sampler
- Bayesian inference of random fields represented with the Karhunen-Loève expansion
- Spectral likelihood expansions for Bayesian inference
- Coupling techniques for nonlinear ensemble filtering
- Gaussian functional regression for linear partial differential equations
- A multiscale strategy for Bayesian inference using transport maps
- Fokker-Planck particle systems for Bayesian inference: computational approaches
- Generative stochastic modeling of strongly nonlinear flows with non-Gaussian statistics
- A modified randomized maximum likelihood for improved Bayesian history matching
- Efficient derivative-free Bayesian inference for large-scale inverse problems
- Sequential Monte Carlo with kernel embedded mappings: the mapping particle filter
- Sequential ensemble transform for Bayesian inverse problems
- Transport map accelerated Markov chain Monte Carlo
- Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion
- Bayesian model inversion using stochastic spectral embedding
- A note on parametric Bayesian inference via gradient flows
- Affine-mapping based variational ensemble Kalman filter
- Bayesian inverse problems with \(l_1\) priors: a randomize-then-optimize approach
- A new network approach to Bayesian inference in partial differential equations
- Implicit sampling for hierarchical Bayesian inversion and applications in fractional multiscale diffusion models
- Computation and learning in high dimensions. Abstracts from the workshop held August 1--7, 2021 (hybrid meeting)
- Goal-oriented optimal approximations of Bayesian linear inverse problems
- Deep composition of tensor-trains using squared inverse Rosenblatt transports
- Sparse approximation of triangular transports. I: The finite-dimensional case
- Sparse approximation of triangular transports. II: The infinite-dimensional case
- A hierarchically low-rank optimal transport dissimilarity measure for structured data
- Sampling, feasibility, and priors in data assimilation
- Iterative importance sampling algorithms for parameter estimation
- Deterministic mean-field ensemble Kalman filtering
- Conditional deep surrogate models for stochastic, high-dimensional, and multi-fidelity systems
- Bayesian inversion using adaptive polynomial chaos kriging within subset simulation
- Sparse variational Bayesian approximations for nonlinear inverse problems: applications in nonlinear elastography
- A transport-based multifidelity preconditioner for Markov chain Monte Carlo
- Diffeomorphic density matching by optimal information transport
- A Continuation Method in Bayesian Inference
- A guided sequential Monte Carlo method for the assimilation of data into stochastic dynamical systems
- Solving linear Bayesian inverse problems using a fractional total variation-Gaussian (FTG) prior and transport map
- Transport map accelerated adaptive importance sampling, and application to inverse problems arising from multiscale stochastic reaction networks
- A minimum free energy model of motor learning
- Some models are useful, but how do we know which ones? Towards a unified Bayesian model taxonomy
- Bayesian nonparametric generative modeling of large multivariate non-Gaussian spatial fields
- Transport map sampling with PGD model reduction for fast dynamical Bayesian data assimilation
- Target-aware Bayesian inference: how to beat optimal conventional estimators
- Ensemble transport smoothing. I: Unified framework
- Reduced space dynamics-based geo-statistical prior sampling for uncertainty quantification of end goal decisions
- On the representation and learning of monotone triangular transport maps
- Parametrization of Random Vectors in Polynomial Chaos Expansions via Optimal Transportation
- A dimension-reduced variational approach for solving physics-based inverse problems using generative adversarial network priors and normalizing flows
- Horseshoe Priors for Edge-Preserving Linear Bayesian Inversion
- Polynomial-chaos-based conditional statistics for probabilistic learning with heterogeneous data applied to atomic collisions of helium on graphite substrate
- Optimal experimental design: formulations and computations
- Transport Monte Carlo: High-Accuracy Posterior Approximation via Random Transport
- Variational Bayesian optimal experimental design with normalizing flows
- A distributed framework for the construction of transport maps
- Scalable Bayesian Transport Maps for High-Dimensional Non-Gaussian Spatial Fields
- Data‐driven physics‐based digital twins via a library of component‐based reduced‐order models
- Fast \(L^2\) optimal mass transport via reduced basis methods for the Monge-Ampère equation
- scientific article; zbMATH DE number 762942
- The transport map computed by iterated function system
- Bayesian learning with Wasserstein barycenters
- Adaptive mesh methods on compact manifolds via optimal transport and optimal information transport
- Learning to solve Bayesian inverse problems: an amortized variational inference approach using Gaussian and flow guides
- Ensemble transport adaptive importance sampling
- Deep Importance Sampling Using Tensor Trains with Application to a Priori and a Posteriori Rare Events
- Conditional sampling with monotone GANs: from generative models to likelihood-free inference
This page was built for publication: Bayesian inference with optimal maps (MaRDI item Q695159)