Low-rank independence samplers in hierarchical Bayesian inverse problems
From MaRDI portal
Publication:4689166
Abstract: In Bayesian inverse problems, the posterior distribution is used to quantify uncertainty about the reconstructed solution. In practice, Markov chain Monte Carlo algorithms are often used to draw samples from the posterior distribution, but such algorithms can be computationally expensive. We present a computationally efficient scheme for sampling high-dimensional Gaussian distributions that arise in ill-posed Bayesian linear inverse problems. Our approach uses Metropolis-Hastings independence sampling with a proposal distribution based on a low-rank approximation of the prior-preconditioned Hessian. We show how the acceptance rate depends on the number of eigenvalues retained and discuss conditions under which the acceptance rate is high. We demonstrate the proposed sampler with Metropolis-Hastings-within-Gibbs sampling in numerical experiments on image deblurring, computerized tomography, and NMR relaxometry.
Recommendations
- Fast Gibbs sampling for high-dimensional Bayesian inversion
- Fast Markov chain Monte Carlo sampling for sparse Bayesian inference in high-dimensional inverse problems using L1-type priors
- Dimension-independent likelihood-informed MCMC
- Optimal low-rank approximations of Bayesian linear inverse problems
- A Metropolis-Hastings-within-Gibbs sampler for nonlinear hierarchical-Bayesian inverse problems
Cites work
- scientific article; zbMATH DE number 5954259 (title unavailable)
- scientific article; zbMATH DE number 194139 (title unavailable)
- scientific article; zbMATH DE number 1243444 (title unavailable)
- scientific article; zbMATH DE number 2117879 (title unavailable)
- scientific article; zbMATH DE number 3189754 (title unavailable)
- A Bayesian generalized CAR model for correlated signal detection
- A Metropolis-Hastings method for linear inverse problems with Poisson likelihood and Gaussian prior
- A computational framework for infinite-dimensional Bayesian inverse problems. I: The linearized case, with application to global seismic inversion
- A computational framework for infinite-dimensional Bayesian inverse problems. II: stochastic Newton MCMC with application to ice sheet flow inverse problems
- A general framework for the parametrization of hierarchical models
- An exploration of aspects of Bayesian multiple testing
- An introduction to variational methods for graphical models
- Analysis of the Gibbs Sampler for Hierarchical Inverse Problems
- Applications of a nonnegatively constrained iterative method with statistically based stopping rules to CT, PET, and SPECT imaging
- Applied multivariate statistical analysis.
- Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations (with discussion)
- Bayesian Inversion of Geoelectrical Resistivity Data
- Bayesian data analysis.
- Bayesian methods for data analysis.
- Component-wise Markov chain Monte Carlo: uniform and geometric ergodicity under mixing and composition
- Covariance structure of the Gibbs sampler with applications to the comparisons of estimators and augmentation schemes
- Discrete inverse problems. Insight and algorithms.
- Efficient MCMC-based image deblurring with Neumann boundary conditions
- Equation of state calculations by fast computing machines
- Fast algorithms for Bayesian uncertainty quantification in large-scale linear inverse problems based on low-rank partial Hessian approximations
- Fast sampling in a linear-Gaussian inverse problem
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions
- Gaussian Markov Random Fields
- Gaussian processes for machine learning.
- Inference from iterative simulation using multiple sequences
- Inverse problems in the Bayesian framework
- Inverse problems: a Bayesian perspective
- MCMC-based image reconstruction with uncertainty quantification
- Monte Carlo sampling methods using Markov chains and their applications
- Numerical methods for large eigenvalue problems
- Optimal low-rank approximations of Bayesian linear inverse problems
- Optimal proposal distributions and adaptive MCMC
- Partially Collapsed Gibbs Samplers
- Point spread function estimation in X-ray imaging with partially collapsed Gibbs sampling
- Practical sketching algorithms for low-rank matrix approximation
- Prior distributions for variance parameters in hierarchical models (Comment on article by Browne and Draper)
- Randomized algorithms for generalized Hermitian eigenvalue problems with application to computing Karhunen-Loève expansion.
- Regularization Tools version 4.0 for MATLAB 7.3
- Sampling-Based Approaches to Calculating Marginal Densities
- Statistical and computational inverse problems.
- Statistical decision theory and Bayesian analysis. 2nd ed
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
- The design and analysis of computer experiments.
Cited in (21 documents)
- Fast sampling in a linear-Gaussian inverse problem
- MCMC Algorithms for Computational UQ of Nonnegativity Constrained Linear Inverse Problems
- Randomized approaches to accelerate MCMC algorithms for Bayesian inverse problems
- Some rapidly mixing hit-and-run samplers for latent counts in linear inverse problems
- A Metropolis-Hastings-within-Gibbs sampler for nonlinear hierarchical-Bayesian inverse problems
- Solving large-scale PDE-constrained Bayesian inverse problems with Riemann manifold Hamiltonian Monte Carlo
- Certified coordinate selection for high-dimensional Bayesian inversion with Laplace prior
- Hybrid samplers for ill-posed inverse problems
- Fast Gibbs sampling for high-dimensional Bayesian inversion
- Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion
- Sampling-free Bayesian inversion with adaptive hierarchical tensor representations
- Implicit sampling for hierarchical Bayesian inversion and applications in fractional multiscale diffusion models
- Computationally efficient sampling methods for sparsity promoting hierarchical Bayesian models
- Localization for MCMC: sampling high-dimensional posterior distributions with local structure
- A computational framework for infinite-dimensional Bayesian inverse problems. II: stochastic Newton MCMC with application to ice sheet flow inverse problems
- Fast Markov chain Monte Carlo sampling for sparse Bayesian inference in high-dimensional inverse problems using L1-type priors
- Analysis of the Gibbs Sampler for Hierarchical Inverse Problems
- Efficient Marginalization-Based MCMC Methods for Hierarchical Bayesian Inverse Problems
- Dimension-Independent MCMC Sampling for Inverse Problems with Non-Gaussian Priors
- Efficient nonparametric Bayesian inference for X-ray transforms
- A Gaussian Process Emulator Based Approach for Bayesian Calibration of a Functional Input