Bayesian inference with optimal maps
From MaRDI portal
Publication: 695159
DOI: 10.1016/j.jcp.2012.07.022
zbMath: 1318.62087
arXiv: 1109.1516
OpenAlex: W2023103251
MaRDI QID: Q695159
Tarek A. El Moselhy, Youssef M. Marzouk
Publication date: 20 December 2012
Published in: Journal of Computational Physics
Full work available at URL: https://arxiv.org/abs/1109.1516
Keywords: inverse problems; measure-preserving maps; Bayesian inference; numerical optimization; optimal transport; polynomial chaos
Related Items
- Sequential ensemble transform for Bayesian inverse problems
- Learning physics-based models from data: perspectives from inverse problems and model reduction
- Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion
- Bayesian model inversion using stochastic spectral embedding
- Efficient derivative-free Bayesian inference for large-scale inverse problems
- Coupling Techniques for Nonlinear Ensemble Filtering
- Bayesian inversion using adaptive polynomial chaos kriging within subset simulation
- Fast $L^2$ Optimal Mass Transport via Reduced Basis Methods for the Monge--Ampère Equation
- Parametrization of Random Vectors in Polynomial Chaos Expansions via Optimal Transportation
- Transport Map Accelerated Markov Chain Monte Carlo
- Probabilistic learning inference of boundary value problem with uncertainties based on Kullback-Leibler divergence under implicit constraints
- Algorithms for Kullback--Leibler Approximation of Probability Measures in Infinite Dimensions
- A Multiscale Strategy for Bayesian Inference Using Transport Maps
- Reduced Space Dynamics-Based Geo-Statistical Prior Sampling for Uncertainty Quantification of End Goal Decisions
- Entropical optimal transport, Schrödinger's system and algorithms
- Diffeomorphic Density Matching by Optimal Information Transport
- Expectation propagation for nonlinear inverse problems -- with an application to electrical impedance tomography
- Low-rank separated representation surrogates of high-dimensional stochastic functions: application in Bayesian inference
- Minimization for conditional simulation: relationship to optimal transport
- Unnamed Item
- Multilevel Monte Carlo for Smoothing via Transport Methods
- Sampling-free linear Bayesian update of polynomial chaos representations
- Sparse variational Bayesian approximations for nonlinear inverse problems: applications in nonlinear elastography
- Sparse approximation of triangular transports. I: The finite-dimensional case
- Sparse approximation of triangular transports. II: The infinite-dimensional case
- Generative Stochastic Modeling of Strongly Nonlinear Flows with Non-Gaussian Statistics
- A new network approach to Bayesian inference in partial differential equations
- Scaling limits in computational Bayesian inversion
- Transport Monte Carlo: High-Accuracy Posterior Approximation via Random Transport
- Data-driven physics-based digital twins via a library of component-based reduced-order models
- Adaptive mesh methods on compact manifolds via optimal transport and optimal information transport
- Interacting Langevin Diffusions: Gradient Structure and Ensemble Kalman Sampler
- Ensemble transport smoothing. I: Unified framework
- Solving linear Bayesian inverse problems using a fractional total variation-Gaussian (FTG) prior and transport map
- Horseshoe Priors for Edge-Preserving Linear Bayesian Inversion
- A Continuation Method in Bayesian Inference
- Bayesian learning with Wasserstein barycenters
- Quantum Wasserstein isometries on the qubit state space
- Some models are useful, but how do we know which ones? Towards a unified Bayesian model taxonomy
- Deep Importance Sampling Using Tensor Trains with Application to a Priori and a Posteriori Rare Events
- A dimension-reduced variational approach for solving physics-based inverse problems using generative adversarial network priors and normalizing flows
- Polynomial-chaos-based conditional statistics for probabilistic learning with heterogeneous data applied to atomic collisions of helium on graphite substrate
- Computation and learning in high dimensions. Abstracts from the workshop held August 1--7, 2021 (hybrid meeting)
- Iterative Importance Sampling Algorithms for Parameter Estimation
- Diffusion Map-based Algorithm for Gain Function Approximation in the Feedback Particle Filter
- Bayesian Inverse Problems with $l_1$ Priors: A Randomize-Then-Optimize Approach
- Goal-Oriented Optimal Approximations of Bayesian Linear Inverse Problems
- A modified randomized maximum likelihood for improved Bayesian history matching
- Sequential Monte Carlo with kernel embedded mappings: the mapping particle filter
- Transport Map Accelerated Adaptive Importance Sampling, and Application to Inverse Problems Arising from Multiscale Stochastic Reaction Networks
- MALA-within-Gibbs Samplers for High-Dimensional Distributions with Sparse Conditional Structure
- A Distributed Framework for the Construction of Transport Maps
- Unnamed Item
- Bayesian inference of random fields represented with the Karhunen-Loève expansion
- Stability of Gibbs Posteriors from the Wasserstein Loss for Bayesian Full Waveform Inversion
- Spectral likelihood expansions for Bayesian inference
- p-kernel Stein variational gradient descent for data assimilation and history matching
- Deterministic Mean-Field Ensemble Kalman Filtering
- Gaussian functional regression for linear partial differential equations
- A Minimum Free Energy Model of Motor Learning
- Iterative construction of Gaussian process surrogate models for Bayesian inference
- A transport-based multifidelity preconditioner for Markov chain Monte Carlo
- Particle-based energetic variational inference
- Implicit sampling for hierarchical Bayesian inversion and applications in fractional multiscale diffusion models
- Ensemble Transport Adaptive Importance Sampling
- Data assimilation: The Schrödinger perspective
- Conditional deep surrogate models for stochastic, high-dimensional, and multi-fidelity systems
- Unnamed Item
- Stein Variational Reduced Basis Bayesian Inversion
- Variational inference for nonlinear inverse problems via neural net kernels: comparison to Bayesian neural networks, application to topology optimization
- A Guided Sequential Monte Carlo Method for the Assimilation of Data into Stochastic Dynamical Systems
- Bayesian Inverse Problems and Kalman Filters
- Deep composition of tensor-trains using squared inverse Rosenblatt transports
- A hierarchically low-rank optimal transport dissimilarity measure for structured data
- Affine-mapping based variational ensemble Kalman filter
- Sampling, feasibility, and priors in data assimilation
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- A dynamical systems framework for intermittent data assimilation
- Implicit particle filters for data assimilation
- Contributions to the theory of convex bodies
- Stochastic spectral methods for efficient Bayesian solution of inverse problems
- Dimensionality reduction and polynomial chaos acceleration of Bayesian inference in inverse problems
- The geometry of optimal transportation
- Existence and uniqueness of monotone measure-preserving maps
- Langevin-type models II: Self-targeting candidates for MCMC algorithms
- Sampling the posterior: an approach to non-Gaussian data assimilation
- An analysis of polynomial chaos approximations for modeling single-fluid-phase flow in porous medium systems
- Monte Carlo strategies in scientific computing
- An adaptive multi-element generalized polynomial chaos method for stochastic differential equations
- Using Bayesian statistics in the estimation of heat source in radiation
- Sequential Monte Carlo Methods in Practice
- Inverse problems: A Bayesian perspective
- A Stochastic Newton MCMC Method for Large-Scale Statistical Inverse Problems with Application to Seismic Inversion
- Nonparametric estimation of diffusions: a differential equations approach
- On Adaptive Choice of Shifts in Rational Krylov Subspace Reduction of Evolutionary Problems
- From Knothe's Transport to Brenier's Map and a Continuation Method for Optimal Transport
- Learn From Thy Neighbor: Parallel-Chain and Regional Adaptive MCMC
- Uncertainty Quantification and Weak Approximation of an Elliptic Inverse Problem
- Spectral Methods for Uncertainty Quantification
- Robust Stochastic Approximation Approach to Stochastic Programming
- Polar factorization and monotone rearrangement of vector-valued functions
- The Regularity of Mappings with a Convex Potential
- Nonstationary inverse problems and state estimation
- Riemann Manifold Langevin and Hamiltonian Monte Carlo Methods
- Physical Systems with Random Uncertainties: Chaos Representations with Arbitrary Probability Measure
- Inverse Problem Theory and Methods for Model Parameter Estimation
- The Wiener--Askey Polynomial Chaos for Stochastic Differential Equations
- Bayes Factors
- Equation of State Calculations by Fast Computing Machines
- A Bayesian approach to characterizing uncertainty in inverse problems using coarse and fine-scale information
- Preconditioning Markov Chain Monte Carlo Simulations Using Coarse-Scale Models
- Monte Carlo sampling methods using Markov chains and their applications
- Jensen’s inequality
- Remarks on a Multivariate Transformation
- Optimal Transport
- MCMC methods for functions: modifying old algorithms to make them faster