Dimension-independent likelihood-informed MCMC
From MaRDI portal
Publication:2374891
Abstract: Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. This work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. Two distinct lines of research intersect in the methods developed here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent, likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Two nonlinear inverse problems are used to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.
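The abstract describes operator-weighted proposals that are reversible with respect to a Gaussian reference measure, with step sizes adapted mode-by-mode using low-rank Hessian information. The following is a minimal sketch of that idea on a toy linear-Gaussian problem (all dimensions, thresholds, and step-size rules below are illustrative assumptions, and this is only the drift-free, prior-reversible special case, not the full DILI sampler with Langevin drift): eigenvectors of the noise-weighted Gauss-Newton Hessian with large eigenvalues span a likelihood-informed subspace where small steps are taken, while the prior-dominated complement takes large, dimension-independent pCN steps. Because each mode follows a stationary AR(1) update with respect to the N(0, I) prior, the acceptance ratio involves only the negative log-likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy whitened linear-Gaussian inverse problem (illustrative sizes):
# data y = J u + noise, prior u ~ N(0, I), noise ~ N(0, s^2 I).
d, m = 50, 10                        # parameter and data dimensions
J = rng.standard_normal((m, d)) / np.sqrt(d)
s = 0.05
u_true = rng.standard_normal(d)
y = J @ u_true + s * rng.standard_normal(m)

def neg_log_like(u):
    r = y - J @ u
    return 0.5 * (r @ r) / s**2

# Likelihood-informed subspace: eigendecompose the noise-weighted
# Gauss-Newton Hessian and keep eigenvalues above a threshold.
H = J.T @ J / s**2
lam, Phi = np.linalg.eigh(H)
keep = lam > 0.1                     # retention threshold (illustrative)
Phi_r, lam_r = Phi[:, keep], lam[keep]

# Per-mode pCN coefficients: small steps in stiff (data-informed)
# directions, a large dimension-independent step in the complement.
beta_r = np.minimum(0.9, 1.0 / np.sqrt(1.0 + lam_r))
a_r = np.sqrt(1.0 - beta_r**2)
beta_perp = 0.8
a_perp = np.sqrt(1.0 - beta_perp**2)

def propose(u):
    """Operator-weighted pCN step, reversible w.r.t. the N(0, I) prior."""
    xi = rng.standard_normal(d)
    w, w_xi = Phi_r.T @ u, Phi_r.T @ xi          # LIS coordinates
    u_perp, xi_perp = u - Phi_r @ w, xi - Phi_r @ w_xi
    return (Phi_r @ (a_r * w + beta_r * w_xi)
            + a_perp * u_perp + beta_perp * xi_perp)

def li_pcn(n_steps, u0):
    u, phi = u0, neg_log_like(u0)
    acc = 0
    for _ in range(n_steps):
        u_new = propose(u)
        phi_new = neg_log_like(u_new)
        # Prior-reversible proposal => likelihood-only acceptance ratio.
        if np.log(rng.uniform()) < phi - phi_new:
            u, phi, acc = u_new, phi_new, acc + 1
    return u, acc / n_steps

u_end, rate = li_pcn(2000, np.zeros(d))
print(f"acceptance rate: {rate:.2f}")
```

On this toy problem the LIS has at most m = 10 directions; in the paper's PDE examples the same low-rank structure is what makes the proposal's performance independent of the discretization dimension.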
Recommendations
- Ensemble sampler for infinite-dimensional inverse problems
- Proposals which speed up function-space MCMC
- Localization for MCMC: sampling high-dimensional posterior distributions with local structure
- Dimension-Independent MCMC Sampling for Inverse Problems with Non-Gaussian Priors
- Adaptive dimension reduction to accelerate infinite-dimensional geometric Markov chain Monte Carlo
Cites work
- scientific article; zbMATH DE number 646825
- scientific article; zbMATH DE number 2045498
- A Hierarchical Multilevel Markov Chain Monte Carlo Algorithm with Applications to Uncertainty Quantification in Subsurface Flow
- A computational framework for infinite-dimensional Bayesian inverse problems. I: The linearized case, with application to global seismic inversion
- A computational framework for infinite-dimensional Bayesian inverse problems. II: stochastic Newton MCMC with application to ice sheet flow inverse problems
- A note on Metropolis-Hastings kernels for general state spaces
- A sequential particle filter method for static models
- A stable manifold MCMC method for high dimensions
- A stochastic Newton MCMC method for large-scale statistical inverse problems with application to seismic inversion
- Active subspace methods in theory and practice: applications to kriging surfaces
- An adaptive Metropolis algorithm
- An adaptive version for the Metropolis adjusted Langevin algorithm with a truncated drift
- Complexity analysis of accelerated MCMC methods for Bayesian inversion
- Coupling and Ergodicity of Adaptive Markov Chain Monte Carlo Algorithms
- Diffusion limits of the random walk Metropolis algorithm in high dimensions
- Equation of state calculations by fast computing machines
- Exact and Computationally Efficient Likelihood-Based Estimation for Discretely Observed Diffusion Processes (with Discussion)
- Fast algorithms for Bayesian uncertainty quantification in large-scale linear inverse problems based on low-rank partial Hessian approximations
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions
- Inverse Problem Theory and Methods for Model Parameter Estimation
- Inverse problems: a Bayesian perspective
- Langevin diffusions and Metropolis-Hastings algorithms
- Likelihood-informed dimension reduction for nonlinear inverse problems
- MAP estimators and their consistency in Bayesian nonparametric inverse problems
- MCMC methods for diffusion bridges
- MCMC methods for functions: modifying old algorithms to make them faster
- Monte Carlo sampling methods using Markov chains and their applications
- Multilevel Monte Carlo Path Simulation
- On the ergodicity of the adaptive Metropolis algorithm on unbounded domains
- On the ergodicity properties of some adaptive MCMC algorithms
- Optimal Scaling of Discrete Approximations to Langevin Diffusions
- Optimal scaling and diffusion limits for the Langevin algorithm in high dimensions
- Optimal scaling for various Metropolis-Hastings algorithms
- Particle Markov Chain Monte Carlo Methods
- Proposals which speed up function-space MCMC
- Randomized algorithms for the low-rank approximation of matrices
- Riemann manifold Langevin and Hamiltonian Monte Carlo methods. With discussion and authors' reply
- Sequential Monte Carlo Samplers
- Signal processing problems on function space: Bayesian formulation, stochastic PDEs and effective MCMC methods
- Spectral gaps for a Metropolis-Hastings algorithm in infinite dimensions
- Stochastic Equations in Infinite Dimensions
- Weak convergence and optimal scaling of random walk Metropolis algorithms
Cited in
(only showing first 100 items)
- Signal processing problems on function space: Bayesian formulation, stochastic PDEs and effective MCMC methods
- Optimization-Based Markov Chain Monte Carlo Methods for Nonlinear Hierarchical Statistical Inverse Problems
- Efficient Marginalization-Based MCMC Methods for Hierarchical Bayesian Inverse Problems
- Finite element representations of Gaussian processes: balancing numerical and statistical accuracy
- Ensemble sampler for infinite-dimensional inverse problems
- Forward and backward uncertainty quantification with active subspaces: application to hypersonic flows around a cylinder
- Generalized parallel tempering on Bayesian inverse problems
- Efficient estimation of hydraulic conductivity heterogeneity with non-redundant measurement information
- Bayesian inference with optimal maps
- Tensor train construction from tensor actions, with application to compression of large high order derivative tensors
- Affine invariant interacting Langevin dynamics for Bayesian inference
- Spatial localization for nonlinear dynamical stochastic models for excitable media
- Scalable Optimization-Based Sampling on Function Space
- Bayesian inversion of a diffusion model with application to biology
- Learning physics-based models from data: perspectives from inverse problems and model reduction
- Accelerating Markov chain Monte Carlo with active subspaces
- Multilevel Markov Chain Monte Carlo
- A stable manifold MCMC method for high dimensions
- Multimodal, high-dimensional, model-based, Bayesian inverse problems with applications in biomechanics
- Striated Metropolis-Hastings sampler for high-dimensional models
- Pass-efficient randomized algorithms for low-rank matrix approximation using any number of views
- Iterative construction of Gaussian process surrogate models for Bayesian inference
- MCMC methods for sampling function space
- A vine-copula based adaptive MCMC sampler for efficient inference of dynamical systems
- Non-stationary multi-layered Gaussian priors for Bayesian inversion
- Scalable posterior approximations for large-scale Bayesian inverse problems via likelihood-informed parameter and state reduction
- MALA-within-Gibbs samplers for high-dimensional distributions with sparse conditional structure
- Solving Bayesian inverse problems from the perspective of deep generative networks
- On a generalization of the preconditioned Crank-Nicolson metropolis algorithm
- A Bayesian level set method for an inverse medium scattering problem in acoustics
- Scaling limits in computational Bayesian inversion
- Efficient parameter estimation for a methane hydrate model with active subspaces
- On an adaptive preconditioned Crank-Nicolson MCMC algorithm for infinite dimensional Bayesian inference
- Probabilistic parameter estimation in a 2-step chemical kinetics model for n-dodecane jet autoignition
- A Bayesian method for an inverse transmission scattering problem in acoustics
- Survey of multifidelity methods in uncertainty propagation, inference, and optimization
- Proposals which speed up function-space MCMC
- Accelerated dimension-independent adaptive metropolis
- Geometric MCMC for infinite-dimensional inverse problems
- Certified dimension reduction in nonlinear Bayesian inverse problems
- Bayesian inference of random fields represented with the Karhunen-Loève expansion
- Low-rank independence samplers in hierarchical Bayesian inverse problems
- A multiscale strategy for Bayesian inference using transport maps
- Localization for MCMC: sampling high-dimensional posterior distributions with local structure
- Rate-optimal refinement strategies for local approximation MCMC
- Data-driven forward discretizations for Bayesian inversion
- Data-free likelihood-informed dimension reduction of Bayesian inverse problems
- Bayesian inference of heterogeneous epidemic models: application to COVID-19 spread accounting for long-term care facilities
- Sequential ensemble transform for Bayesian inverse problems
- A hybrid adaptive MCMC algorithm in function spaces
- Image inversion and uncertainty quantification for constitutive laws of pattern formation
- Multilevel sequential Monte Carlo with dimension-independent likelihood-informed proposals
- A computational framework for infinite-dimensional Bayesian inverse problems. II: stochastic Newton MCMC with application to ice sheet flow inverse problems
- Bayesian inverse problems with \(l_1\) priors: a randomize-then-optimize approach
- Scaling Up Bayesian Uncertainty Quantification for Inverse Problems Using Deep Neural Networks
- Hierarchical Matrix Approximations of Hessians Arising in Inverse Problems Governed by PDEs
- Hessian-based adaptive sparse quadrature for infinite-dimensional Bayesian inverse problems
- A Randomized Maximum A Posteriori Method for Posterior Sampling of High Dimensional Nonlinear Bayesian Inverse Problems
- Goal-oriented optimal approximations of Bayesian linear inverse problems
- Iterative importance sampling algorithms for parameter estimation
- Optimization based methods for partially observed chaotic systems
- FEM-based discretization-invariant MCMC methods for PDE-constrained Bayesian inverse problems
- Wavelet-based priors accelerate maximum-a-posteriori optimization in Bayesian inverse problems
- An Adaptive Independence Sampler MCMC Algorithm for Bayesian Inferences of Functions
- Approximation and sampling of multivariate probability distributions in the tensor train decomposition
- Optimal low-rank approximations of Bayesian linear inverse problems
- Particle Filtering for Stochastic Navier--Stokes Signal Observed with Linear Additive Noise
- Randomized maximum likelihood based posterior sampling
- Likelihood-free inference in high dimensions with synthetic likelihood
- A Bayesian level set method for the shape reconstruction of inverse scattering problems in elasticity
- Stein variational gradient descent on infinite-dimensional space and applications to statistical inverse problems
- Adaptive dimension reduction to accelerate infinite-dimensional geometric Markov chain Monte Carlo
- Variational Bayes' Method for Functions with Applications to Some Inverse Problems
- Properties of the affine-invariant ensemble sampler's 'stretch move' in high dimensions
- Principal feature detection via \(\phi \)-Sobolev inequalities
- Residual-based error correction for neural operator accelerated Infinite-dimensional Bayesian inverse problems
- Solution of physics-based inverse problems using conditional generative adversarial networks with full gradient penalty
- On posterior consistency of data assimilation with Gaussian process priors: the 2D-Navier-Stokes equations
- Solving linear Bayesian inverse problems using a fractional total variation-Gaussian (FTG) prior and transport map
- A Bayesian approach for consistent reconstruction of inclusions
- Large-scale Bayesian optimal experimental design with derivative-informed projected neural network
- Online MCMC Thinning with Kernelized Stein Discrepancy
- Optimal experimental design for infinite-dimensional Bayesian inverse problems governed by PDEs: a review
- On polynomial-time computation of high-dimensional posterior measures by Langevin-type algorithms
- A posteriori stochastic correction of reduced models in delayed-acceptance MCMC, with application to multiphase subsurface inverse problems
- Projected Wasserstein Gradient Descent for High-Dimensional Bayesian Inference
- Multilevel hierarchical decomposition of finite element white noise with application to multilevel Markov chain Monte Carlo
- Consistent inference for diffusions from low frequency measurements
- Derivative-informed neural operator: an efficient framework for high-dimensional parametric derivative learning
- Prior normalization for certified likelihood-informed subspace detection of Bayesian inverse problems
- Langevin diffusion for population based sampling with an application in Bayesian inference for pharmacodynamics
- Multilevel dimension-independent likelihood-informed MCMC for large-scale inverse problems
- Analysis of a Class of Multilevel Markov Chain Monte Carlo Algorithms Based on Independent Metropolis–Hastings
- Optimal experimental design: formulations and computations
- hIPPYlib-MUQ: a Bayesian inference software framework for integration of data with complex predictive models under uncertainty
- Adaptive inference over Besov spaces in the white noise model using \(p\)-exponential priors
- Sparse approximation of triangular transports. I: The finite-dimensional case
- A unified performance analysis of likelihood-informed subspace methods
- Two Metropolis--Hastings Algorithms for Posterior Measures with Non-Gaussian Priors in Infinite Dimensions