MALA-within-Gibbs Samplers for High-Dimensional Distributions with Sparse Conditional Structure
From MaRDI portal
Publication:3300855
DOI: 10.1137/19M1284014 ⋮ zbMath: 1444.62073 ⋮ arXiv: 1908.09429 ⋮ OpenAlex: W3036554994 ⋮ MaRDI QID: Q3300855
Youssef M. Marzouk, Matthias Morzfeld, Xin Thomson Tong
Publication date: 30 July 2020
Published in: SIAM Journal on Scientific Computing
Full work available at URL: https://arxiv.org/abs/1908.09429
Keywords: Markov chain Monte Carlo ⋮ Bayesian computation ⋮ high-dimensional distributions ⋮ Metropolis-adjusted Langevin algorithm (MALA)
MSC classification: Estimation in multivariate analysis (62H12) ⋮ Analysis of algorithms and problem complexity (68Q25) ⋮ Monte Carlo methods (65C05)
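For readers unfamiliar with the method named in the title, here is a minimal sketch of a MALA-within-Gibbs sweep: each coordinate block is updated in turn with a Langevin proposal driven by the gradient of its conditional log-density, followed by a Metropolis-Hastings correction. The toy target (a standard Gaussian), the step size, and all function names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target: standard Gaussian in d dimensions, so the conditional
# of block i given the rest has log-density -x[i]^2/2 and gradient -x[i].
# In the sparse-conditional setting of the paper, these would depend only on
# a few neighboring blocks.
def log_cond(x, i):
    return -0.5 * x[i] ** 2

def grad_log_cond(x, i):
    return -x[i]

def mala_within_gibbs_sweep(x, eps=0.5):
    """One Gibbs sweep: a MALA update for each coordinate block in turn."""
    d = len(x)
    for i in range(d):
        g = grad_log_cond(x, i)
        # Langevin proposal for block i only (all other blocks held fixed).
        prop = x[i] + 0.5 * eps**2 * g + eps * rng.standard_normal()
        x_prop = x.copy()
        x_prop[i] = prop
        g_prop = grad_log_cond(x_prop, i)
        # Metropolis-Hastings correction for the asymmetric Langevin kernel.
        log_q_fwd = -((prop - x[i] - 0.5 * eps**2 * g) ** 2) / (2 * eps**2)
        log_q_bwd = -((x[i] - prop - 0.5 * eps**2 * g_prop) ** 2) / (2 * eps**2)
        log_alpha = log_cond(x_prop, i) - log_cond(x, i) + log_q_bwd - log_q_fwd
        if np.log(rng.uniform()) < log_alpha:
            x = x_prop
    return x

x = np.zeros(5)
samples = []
for _ in range(2000):
    x = mala_within_gibbs_sweep(x)
    samples.append(x.copy())
samples = np.asarray(samples)
print(samples.shape)  # (2000, 5)
```

Because each block update touches only that block's conditional, a sweep costs no more than a sequence of low-dimensional MALA steps, which is the structural property the paper exploits for dimension-independent performance.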
Related Items (10)
Constrained ensemble Langevin Monte Carlo ⋮ Spectral gap of replica exchange Langevin diffusion on mixture distributions ⋮ Sparse approximation of triangular transports. I: The finite-dimensional case ⋮ A CVAE-within-Gibbs sampler for Bayesian linear inverse problems with hyperparameters ⋮ Localized ensemble Kalman inversion ⋮ A unified performance analysis of likelihood-informed subspace methods ⋮ Multiscale sampling for the inverse modeling of partial differential equations ⋮ Analysis of a localised nonlinear ensemble Kalman Bucy filter with complete and accurate observations ⋮ Integrative methods for post-selection inference under convex constraints ⋮ Unnamed Item
Cites Work
- Spectral gaps for a Metropolis-Hastings algorithm in infinite dimensions
- Bayesian inference with optimal maps
- Monte Carlo errors with less errors
- Parameter estimation by implicit sampling
- Optimal scaling for partially updating MCMC algorithms
- Weak convergence and optimal scaling of random walk Metropolis algorithms
- Hierarchical models: local proposal variances for RWM-within-Gibbs and MALA-within-Gibbs
- Importance sampling: intrinsic dimension and computational cost
- On a generalization of the preconditioned Crank-Nicolson metropolis algorithm
- Random-walk interpretations of classical iteration methods
- Accelerating Metropolis-within-Gibbs sampler with localized computations of differential equations
- Localization for MCMC: sampling high-dimensional posterior distributions with local structure
- Adaptive dimension reduction to accelerate infinite-dimensional geometric Markov chain Monte Carlo
- Spatial localization for nonlinear dynamical stochastic models for excitable media
- Dimension-independent likelihood-informed MCMC
- Scalable posterior approximations for large-scale Bayesian inverse problems via likelihood-informed parameter and state reduction
- Optimal scalings for local Metropolis-Hastings chains on nonproduct targets in high dimensions
- Accelerated Gibbs sampling of normal distributions using matrix splittings and polynomials
- Optimal tuning of the hybrid Monte Carlo algorithm
- A stable manifold MCMC method for high dimensions
- Accelerated Dimension-Independent Adaptive Metropolis
- Dimension-Independent MCMC Sampling for Inverse Problems with Non-Gaussian Priors
- Uncertainty Quantification and Weak Approximation of an Elliptic Inverse Problem
- Fast Sampling in a Linear-Gaussian Inverse Problem
- Optimal Low-rank Approximations of Bayesian Linear Inverse Problems
- Optimal Scaling of Discrete Approximations to Langevin Diffusions
- Log Gaussian Cox Processes
- Rigorous Analysis for Efficient Statistically Accurate Algorithms for Solving Fokker--Planck Equations in Large Dimensions
- Riemann Manifold Langevin and Hamiltonian Monte Carlo Methods
- Stochastic Tools in Mathematics and Science
- Scalable Optimization-Based Sampling on Function Space
- Probabilistic Forecasting and Bayesian Data Assimilation
- Scaling Limits for the Transient Phase of Local Metropolis–Hastings Algorithms
- Data Assimilation
- A Computational Framework for Infinite-Dimensional Bayesian Inverse Problems Part I: The Linearized Case, with Application to Global Seismic Inversion
- A function space HMC algorithm with second order Langevin diffusion limit
- Component-wise Markov chain Monte Carlo: uniform and geometric ergodicity under mixing and composition
- MCMC methods for functions: modifying old algorithms to make them faster