Dimension-Independent MCMC Sampling for Inverse Problems with Non-Gaussian Priors
DOI: 10.1137/130929904
zbMath: 1322.65008
arXiv: 1302.2213
OpenAlex: W2963018747
MaRDI QID: Q2945165
Publication date: 9 September 2015
Published in: SIAM/ASA Journal on Uncertainty Quantification
Full work available at URL: https://arxiv.org/abs/1302.2213
Keywords: computational complexity; groundwater flow; Metropolis-Hastings algorithm; Markov chain Monte Carlo methods; Bayesian inverse problems; elliptic inverse problem; non-Gaussian prior measures
MSC classification:
- Computational methods in Markov chains (60J22)
- Probabilistic methods, particle methods, etc. for boundary value problems involving PDEs (65N75)
- Monte Carlo methods (65C05)
- Flows in porous media; filtration; seepage (76S05)
- Inverse problems for PDEs (35R30)
- Numerical analysis or methods applied to Markov chains (65C40)
- Complexity and performance of numerical algorithms (65Y20)
- Numerical methods for inverse problems for boundary value problems involving PDEs (65N21)
Cites Work
- Diffusion limits of the random walk Metropolis algorithm in high dimensions
- Besov priors for Bayesian inverse problems
- Spectral gaps for a Metropolis-Hastings algorithm in infinite dimensions
- Convergence rates of best \(N\)-term Galerkin approximations for a class of elliptic SPDEs
- Rigorous confidence bounds for MCMC under a geometric drift condition
- Dirichlet forms and symmetric Markov processes.
- Error estimates of finite element methods for parameter identifications in elliptic and parabolic systems
- Optimal scaling of random walk Metropolis algorithms with discontinuous target densities
- Positivity of hit-and-run and related algorithms
- General state space Markov chains and MCMC algorithms
- On variance conditions for Markov chain CLTs
- Central limit theorem for additive functionals of reversible Markov processes and applications to simple exclusions
- A note on Metropolis-Hastings kernels for general state spaces
- Comparison theorems for reversible Markov chains
- Exponential convergence of Langevin distributions and their discrete approximations
- Geometric ergodicity and hybrid Markov chains
- Geometric ergodicity of Metropolis algorithms
- Statistical and computational inverse problems.
- Rates of convergence of the Hastings and Metropolis algorithms
- Convergence properties of the Gibbs sampler for perturbations of Gaussians
- CLTs and asymptotic variance of time-sampled Markov chains
- Complexity analysis of accelerated MCMC methods for Bayesian inversion
- Sparse deterministic approximation of Bayesian inverse problems
- Inverse problems: A Bayesian perspective
- Explicit error bounds for Markov chain Monte Carlo
- Sparse tensor discretizations of high-dimensional parametric and stochastic PDEs
- Uncertainty Quantification and Weak Approximation of an Elliptic Inverse Problem
- Geometric \(L^2\) and \(L^1\) convergence are equivalent for reversible Markov chains
- Fixed-Width Output Analysis for Markov Chain Monte Carlo
- Handbook of Markov Chain Monte Carlo
- Markov Chains and Stochastic Stability
- Bayesian inverse problems for functions and applications to fluid mechanics
- Séminaire de Probabilités XXXVI
- Dirichlet forms: Some infinite‐dimensional examples
- Galerkin Finite Element Approximations of Stochastic Elliptic Partial Differential Equations
- Nonasymptotic mixing of the MALA algorithm
- Random Fields and Geometry
- Fixed Precision MCMC Estimation by Median of Products of Averages
- MAP estimators and their consistency in Bayesian nonparametric inverse problems