Geometric MCMC for infinite-dimensional inverse problems

From MaRDI portal
Publication: 1685436

DOI: 10.1016/j.jcp.2016.12.041 · zbMath: 1375.35627 · arXiv: 1606.06351 · OpenAlex: W2462335633 · MaRDI QID: Q1685436

Shiwei Lan, Andrew M. Stuart, Alexandros Beskos, Mark A. Girolami, Patrick E. Farrell

Publication date: 14 December 2017

Published in: Journal of Computational Physics

Full work available at URL: https://arxiv.org/abs/1606.06351



Related Items

Projected Wasserstein Gradient Descent for High-Dimensional Bayesian Inference
Statistical Finite Elements via Langevin Dynamics
Continuum limit and preconditioned Langevin sampling of the path integral molecular dynamics
Physics-informed machine learning with conditional Karhunen-Loève expansions
Learning physics-based models from data: perspectives from inverse problems and model reduction
Variational Bayesian approximation of inverse problems using sparse precision matrices
Multilevel Sequential Monte Carlo with Dimension-Independent Likelihood-Informed Proposals
Consistency of Bayesian inference with Gaussian process priors for a parabolic inverse problem
An Acceleration Strategy for Randomize-Then-Optimize Sampling Via Deep Neural Networks
Bayesian neural network priors for edge-preserving inversion
A unified performance analysis of likelihood-informed subspace methods
Localization of Moving Sources: Uniqueness, Stability, and Bayesian Inference
Scaling Up Bayesian Uncertainty Quantification for Inverse Problems Using Deep Neural Networks
Non-reversible guided Metropolis kernel
Laplace priors and spatial inhomogeneity in Bayesian inverse problems
On the accept-reject mechanism for Metropolis-Hastings algorithms
Dimension-independent Markov chain Monte Carlo on the sphere
Large-scale Bayesian optimal experimental design with derivative-informed projected neural network
Multilevel Delayed Acceptance MCMC
Semi-supervised invertible neural operators for Bayesian inverse problems
Scalable Optimization-Based Sampling on Function Space
Bayesian spatiotemporal modeling for inverse problems
Chilled sampling for uncertainty quantification: a motivation from a meteorological inverse problem *
Derivative-informed neural operator: an efficient framework for high-dimensional parametric derivative learning
Consistency of Bayesian inference with Gaussian process priors in an elliptic inverse problem
A Bayesian Approach to Estimating Background Flows from a Passive Scalar
Sampling of Bayesian posteriors with a non-Gaussian probabilistic learning on manifolds from a small dataset
Hierarchical Matrix Approximations of Hessians Arising in Inverse Problems Governed by PDEs
Adaptive dimension reduction to accelerate infinite-dimensional geometric Markov chain Monte Carlo
Demonstration of the relationship between sensitivity and identifiability for inverse uncertainty quantification
Non-stationary multi-layered Gaussian priors for Bayesian inversion
Bayesian inference of heterogeneous epidemic models: application to COVID-19 spread accounting for long-term care facilities
Multimodal Bayesian registration of noisy functions using Hamiltonian Monte Carlo
Multilevel Hierarchical Decomposition of Finite Element White Noise with Application to Multilevel Markov Chain Monte Carlo
Stability of Gibbs Posteriors from the Wasserstein Loss for Bayesian Full Waveform Inversion
The statistical finite element method (statFEM) for coherent synthesis of observation data and model predictions
Bayesian inversion of a diffusion model with application to biology
Bernstein--von Mises Theorems and Uncertainty Quantification for Linear Inverse Problems
Ensemble sampler for infinite-dimensional inverse problems
Generalized parallel tempering on Bayesian inverse problems
Data assimilation: The Schrödinger perspective
Non-stationary phase of the MALA algorithm
Statistical guarantees for Bayesian uncertainty quantification in nonlinear inverse problems with Gaussian process priors
Two Metropolis--Hastings Algorithms for Posterior Measures with Non-Gaussian Priors in Infinite Dimensions
Multilevel Markov Chain Monte Carlo for Bayesian Inversion of Parabolic Partial Differential Equations under Gaussian Prior
Stein Variational Reduced Basis Bayesian Inversion
Variational inference for nonlinear inverse problems via neural net kernels: comparison to Bayesian neural networks, application to topology optimization
Analysis of a multilevel Markov chain Monte Carlo finite element method for Bayesian inversion of log-normal diffusions
Optimal experimental design for infinite-dimensional Bayesian inverse problems governed by PDEs: a review
Mixing rates for Hamiltonian Monte Carlo algorithms in finite and infinite dimensions
Data-free likelihood-informed dimension reduction of Bayesian inverse problems


Uses Software


Cites Work