On the convergence of the Laplace approximation and noise-level-robustness of Laplace-based Monte Carlo methods for Bayesian inverse problems
DOI: 10.1007/s00211-020-01131-1 · zbMATH Open: 1446.65098 · arXiv: 1901.03958 · OpenAlex: W3043749415 · MaRDI QID: Q2194045 · FDO: Q2194045
Authors: Claudia Schillings, Björn Sprungk, Philipp Wacker
Publication date: 25 August 2020
Published in: Numerische Mathematik
Full work available at URL: https://arxiv.org/abs/1901.03958
Recommendations
- Non-asymptotic error estimates for the Laplace approximation in Bayesian inverse problems
- Convergence rate for the Bayesian approach to linear inverse problems
- A Bayesian approach to multiscale inverse problems using the sequential Monte Carlo method
- Quasi-Monte Carlo Bayesian estimation under Besov priors in elliptic inverse problems
- Convergence analysis of surrogate-based methods for Bayesian inverse problems
- Approximation of Bayesian Inverse Problems for PDEs
- On the well-posedness of Bayesian inverse problems
- Sequential Monte Carlo methods for Bayesian elliptic inverse problems
- Bayesian inverse problems with \(l_1\) priors: a randomize-then-optimize approach
- A hierarchical Bayesian setting for an inverse problem in linear parabolic PDEs with noisy boundary conditions
MSC classification:
- Convergence of probability measures (60B10)
- Bayesian inference (62F15)
- Monte Carlo methods (65C05)
- Numerical optimization and variational techniques (65K10)
- Bayesian problems; characterization of Bayes procedures (62C10)
- Random measures (60G57)
- Stability and convergence of numerical methods for initial value and initial-boundary value problems involving PDEs (65M12)
- Error bounds for initial value and initial-boundary value problems involving PDEs (65M15)
- Numerical methods for inverse problems for initial value and initial-boundary value problems involving PDEs (65M32)
Cites Work
- Application of quasi-Monte Carlo methods to elliptic PDEs with random diffusion coefficients: a survey of analysis and implementation
- Asymptotic Statistics
- Statistical and computational inverse problems.
- On Choosing and Bounding Probability Metrics
- On the consistency of Bayes estimates
- Convergence rates of posterior distributions.
- Combinatorics of partial derivatives
- Nonparametric Bernstein-von Mises theorems in Gaussian white noise
- On the Bernstein-von Mises phenomenon for nonparametric Bayes procedures
- Bernstein-von Mises theorems for statistical inverse problems. I: Schrödinger equation
- On the Bernstein-von Mises theorem with infinite-dimensional parameters
- Inverse problems: a Bayesian perspective
- An analysis of Bayesian inference for nonparametric regression
- Frequentist coverage of adaptive nonparametric Bayesian credible sets
- The Bernstein-von Mises theorem under misspecification
- Asymptotic approximations of integrals
- High-dimensional integration: The quasi-Monte Carlo way
- A stochastic collocation approach to Bayesian inference in inverse problems
- Sparse, adaptive Smolyak quadratures for Bayesian inverse problems
- Complexity analysis of accelerated MCMC methods for Bayesian inversion
- Higher order quasi-Monte Carlo integration for holomorphic, parametric operator equations
- Sparsity in Bayesian inversion of parametric operator equations
- A Hierarchical Multilevel Markov Chain Monte Carlo Algorithm with Applications to Uncertainty Quantification in Subsurface Flow
- On a generalization of the preconditioned Crank-Nicolson metropolis algorithm
- Fast estimation of expected information gains for Bayesian experimental designs based on Laplace approximations
- A Fast and Scalable Method for A-Optimal Design of Experiments for Infinite-dimensional Bayesian Nonlinear Inverse Problems
- Hessian-based adaptive sparse quadrature for infinite-dimensional Bayesian inverse problems
- On the Bernstein-von Mises approximation of posterior distributions
- Gaussian approximations for probability measures on \(\mathbb R^d\)
- Multilevel higher-order quasi-Monte Carlo Bayesian estimation
- A Review of Modern Computational Algorithms for Bayesian Optimal Design
- Kullback-Leibler approximation for probability measures on infinite dimensional spaces
- Quasi-Monte Carlo and multilevel Monte Carlo methods for computing posterior expectations in elliptic inverse problems
- Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain
- Scaling limits in computational Bayesian inversion
- Dimension-Independent MCMC Sampling for Inverse Problems with Non-Gaussian Priors
Cited In (31)
- On the Error Rate of Importance Sampling with Randomized Quasi-Monte Carlo
- Shape reconstructions by using plasmon resonances
- Residual-based error correction for neural operator accelerated infinite-dimensional Bayesian inverse problems
- Non-asymptotic error estimates for the Laplace approximation in Bayesian inverse problems
- Wasserstein convergence rates of increasingly concentrating probability measures
- Certified dimension reduction in nonlinear Bayesian inverse problems
- Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion
- Optimal experimental design: formulations and computations
- Multimodal information gain in Bayesian design of experiments
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors
- Determination of the reaction coefficient in a time dependent nonlocal diffusion process
- Learning physics-based models from data: perspectives from inverse problems and model reduction
- Dimension Free Nonasymptotic Bounds on the Accuracy of High-Dimensional Laplace Approximation
- Shape reconstructions by using plasmon resonances with enhanced sensitivity
- Unbiased MLMC Stochastic Gradient-Based Optimization of Bayesian Experimental Designs
- Scaling limits in computational Bayesian inversion
- The computational asymptotics of Gaussian variational inference and the Laplace approximation
- Sparse approximation of triangular transports. I: The finite-dimensional case
- Small-noise approximation for Bayesian optimal experimental design with nuisance uncertainty
- Bayesian neural network priors for edge-preserving inversion
- Isogeometric multilevel quadrature for forward and inverse random acoustic scattering
- Horseshoe Priors for Edge-Preserving Linear Bayesian Inversion
- A Hybrid Gibbs Sampler for Edge-Preserving Tomographic Reconstruction with Uncertain View Angles
- Re-thinking high-dimensional mathematical statistics. Abstracts from the workshop held May 15--21, 2022
- On log-concave approximations of high-dimensional posterior measures and stability properties in non-linear inverse problems
- Adaptive operator learning for infinite-dimensional Bayesian inverse problems
- Context-Aware Surrogate Modeling for Balancing Approximation and Sampling Costs in Multifidelity Importance Sampling and Bayesian Inverse Problems
- Variational inference for nonlinear inverse problems via neural net kernels: comparison to Bayesian neural networks, application to topology optimization
- Stability of Gibbs Posteriors from the Wasserstein Loss for Bayesian Full Waveform Inversion
- Multilevel Monte Carlo estimation of the expected value of sample information
- Quasi-Monte Carlo and multilevel Monte Carlo methods for computing posterior expectations in elliptic inverse problems