Geometric MCMC for infinite-dimensional inverse problems
From MaRDI portal
Publication: 1685436
DOI: 10.1016/j.jcp.2016.12.041 · zbMath: 1375.35627 · arXiv: 1606.06351 · OpenAlex: W2462335633 · MaRDI QID: Q1685436
Shiwei Lan, Andrew M. Stuart, Alexandros Beskos, Mark A. Girolami, Patrick E. Farrell
Publication date: 14 December 2017
Published in: Journal of Computational Physics
Full work available at URL: https://arxiv.org/abs/1606.06351
Keywords: Markov chain Monte Carlo; uncertainty quantification; Bayesian inverse problems; infinite dimensions; local preconditioning
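As background for the record's topic: the paper builds on dimension-robust function-space MCMC, whose starting point is the preconditioned Crank-Nicolson (pCN) proposal (see "MCMC methods for functions: modifying old algorithms to make them faster" in the Cites Work list below). The following is a minimal illustrative sketch of pCN in whitened coordinates (prior N(0, I)), not the paper's geometric algorithm; all names here are hypothetical.

```python
import numpy as np

def pcn_step(u, log_likelihood, beta, rng):
    """One preconditioned Crank-Nicolson (pCN) step.

    Targets a posterior with N(0, I) Gaussian prior (whitened
    coordinates) and log-likelihood `log_likelihood`. The proposal
        v = sqrt(1 - beta^2) * u + beta * xi,  xi ~ N(0, I),
    is prior-reversible, so the accept ratio involves only the
    likelihood -- this is what makes pCN robust to dimension.
    """
    xi = rng.standard_normal(u.shape)
    v = np.sqrt(1.0 - beta**2) * u + beta * xi
    log_alpha = log_likelihood(v) - log_likelihood(u)
    if np.log(rng.uniform()) < log_alpha:
        return v, True
    return u, False

# Toy example: Gaussian likelihood centered at 1 in each coordinate,
# so the posterior is N(0.5, 0.5 I) in every coordinate.
rng = np.random.default_rng(0)
log_lik = lambda u: -0.5 * np.sum((u - 1.0) ** 2)
u = np.zeros(50)
accepts = 0
for _ in range(5000):
    u, ok = pcn_step(u, log_lik, beta=0.2, rng=rng)
    accepts += ok
print(accepts / 5000)  # empirical acceptance rate
```

Because the prior is handled exactly by the proposal, the acceptance rate does not collapse as the discretization dimension grows; the geometric methods of the paper additionally exploit likelihood (posterior-geometry) information in the proposal.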
Related Items
- Projected Wasserstein Gradient Descent for High-Dimensional Bayesian Inference
- Statistical Finite Elements via Langevin Dynamics
- Continuum limit and preconditioned Langevin sampling of the path integral molecular dynamics
- Physics-informed machine learning with conditional Karhunen-Loève expansions
- Learning physics-based models from data: perspectives from inverse problems and model reduction
- Variational Bayesian approximation of inverse problems using sparse precision matrices
- Multilevel Sequential Monte Carlo with Dimension-Independent Likelihood-Informed Proposals
- Consistency of Bayesian inference with Gaussian process priors for a parabolic inverse problem
- An Acceleration Strategy for Randomize-Then-Optimize Sampling Via Deep Neural Networks
- Bayesian neural network priors for edge-preserving inversion
- A unified performance analysis of likelihood-informed subspace methods
- Localization of Moving Sources: Uniqueness, Stability, and Bayesian Inference
- Scaling Up Bayesian Uncertainty Quantification for Inverse Problems Using Deep Neural Networks
- Non-reversible guided Metropolis kernel
- Laplace priors and spatial inhomogeneity in Bayesian inverse problems
- On the accept-reject mechanism for Metropolis-Hastings algorithms
- Dimension-independent Markov chain Monte Carlo on the sphere
- Large-scale Bayesian optimal experimental design with derivative-informed projected neural network
- Multilevel Delayed Acceptance MCMC
- Semi-supervised invertible neural operators for Bayesian inverse problems
- Scalable Optimization-Based Sampling on Function Space
- Bayesian spatiotemporal modeling for inverse problems
- Chilled sampling for uncertainty quantification: a motivation from a meteorological inverse problem
- Derivative-informed neural operator: an efficient framework for high-dimensional parametric derivative learning
- Consistency of Bayesian inference with Gaussian process priors in an elliptic inverse problem
- A Bayesian Approach to Estimating Background Flows from a Passive Scalar
- Sampling of Bayesian posteriors with a non-Gaussian probabilistic learning on manifolds from a small dataset
- Hierarchical Matrix Approximations of Hessians Arising in Inverse Problems Governed by PDEs
- Adaptive dimension reduction to accelerate infinite-dimensional geometric Markov chain Monte Carlo
- Demonstration of the relationship between sensitivity and identifiability for inverse uncertainty quantification
- Non-stationary multi-layered Gaussian priors for Bayesian inversion
- Bayesian inference of heterogeneous epidemic models: application to COVID-19 spread accounting for long-term care facilities
- Multimodal Bayesian registration of noisy functions using Hamiltonian Monte Carlo
- Multilevel Hierarchical Decomposition of Finite Element White Noise with Application to Multilevel Markov Chain Monte Carlo
- Stability of Gibbs Posteriors from the Wasserstein Loss for Bayesian Full Waveform Inversion
- The statistical finite element method (statFEM) for coherent synthesis of observation data and model predictions
- Bayesian inversion of a diffusion model with application to biology
- Bernstein-von Mises Theorems and Uncertainty Quantification for Linear Inverse Problems
- Ensemble sampler for infinite-dimensional inverse problems
- Generalized parallel tempering on Bayesian inverse problems
- Data assimilation: The Schrödinger perspective
- Non-stationary phase of the MALA algorithm
- Statistical guarantees for Bayesian uncertainty quantification in nonlinear inverse problems with Gaussian process priors
- Two Metropolis-Hastings Algorithms for Posterior Measures with Non-Gaussian Priors in Infinite Dimensions
- Multilevel Markov Chain Monte Carlo for Bayesian Inversion of Parabolic Partial Differential Equations under Gaussian Prior
- Stein Variational Reduced Basis Bayesian Inversion
- Variational inference for nonlinear inverse problems via neural net kernels: comparison to Bayesian neural networks, application to topology optimization
- Analysis of a multilevel Markov chain Monte Carlo finite element method for Bayesian inversion of log-normal diffusions
- Optimal experimental design for infinite-dimensional Bayesian inverse problems governed by PDEs: a review
- Mixing rates for Hamiltonian Monte Carlo algorithms in finite and infinite dimensions
- Data-free likelihood-informed dimension reduction of Bayesian inverse problems
Uses Software
Cites Work
- Spectral gaps for a Metropolis-Hastings algorithm in infinite dimensions
- A comparison of outlet boundary treatments for prevention of backflow divergence with relevance to blood flow simulations
- Automated solution of differential equations by the finite element method. The FEniCS book
- Hybrid Monte Carlo on Hilbert spaces
- Emulation of higher-order tensors in manifold Monte Carlo methods for Bayesian inverse problems
- A note on Metropolis-Hastings kernels for general state spaces
- Weak convergence and optimal scaling of random walk Metropolis algorithms
- On a generalization of the preconditioned Crank-Nicolson Metropolis algorithm
- Advanced MCMC methods for sampling on diffusion pathspace
- Proposals which speed up function-space MCMC
- Dimension-independent likelihood-informed MCMC
- Langevin diffusions and the Metropolis-adjusted Langevin algorithm
- Accelerating Markov Chain Monte Carlo with Active Subspaces
- Automated Derivation of the Adjoint of High-Level Transient Finite Element Programs
- A Stochastic Newton MCMC Method for Large-Scale Statistical Inverse Problems with Application to Seismic Inversion
- A Computational Framework for Infinite-Dimensional Bayesian Inverse Problems, Part II: Stochastic Newton MCMC with Application to Ice Sheet Flow Inverse Problems
- Solving large-scale PDE-constrained Bayesian inverse problems with Riemann manifold Hamiltonian Monte Carlo
- Likelihood-informed dimension reduction for nonlinear inverse problems
- Active Subspaces
- Uncertainty Quantification and Weak Approximation of an Elliptic Inverse Problem
- The Geometry of Random Fields
- Algorithms for Kullback-Leibler Approximation of Probability Measures in Infinite Dimensions
- MCMC methods for diffusion bridges
- On the small balls problem for equivalent Gaussian measures
- Riemann Manifold Langevin and Hamiltonian Monte Carlo Methods
- Adaptive Hessian-Based Nonstationary Gaussian Process Response Surface Method for Probability Density Approximation with Application to Bayesian Solution of Large-Scale Inverse Problems
- Stochastic Equations in Infinite Dimensions
- MCMC methods for functions: modifying old algorithms to make them faster