Low-rank independence samplers in hierarchical Bayesian inverse problems

From MaRDI portal
Publication: Q4689166

Identifiers:
DOI: 10.1137/17M1137218
zbMATH Open: 1401.65040
arXiv: 1609.07180
OpenAlex: W2883157112
Wikidata: Q129495354 (Scholia: Q129495354)
MaRDI QID: Q4689166
FDO: Q4689166


Authors: D. Andrew Brown, Arvind K. Saibaba, Sarah Vallélian


Publication date: 15 October 2018

Published in: SIAM/ASA Journal on Uncertainty Quantification

Abstract: In Bayesian inverse problems, the posterior distribution is used to quantify uncertainty about the reconstructed solution. In practice, Markov chain Monte Carlo algorithms are often used to draw samples from the posterior distribution. However, implementations of such algorithms can be computationally expensive. We present a computationally efficient scheme for sampling high-dimensional Gaussian distributions in ill-posed Bayesian linear inverse problems. Our approach uses Metropolis-Hastings independence sampling with a proposal distribution based on a low-rank approximation of the prior-preconditioned Hessian. We show the dependence of the acceptance rate on the number of eigenvalues retained and discuss conditions under which the acceptance rate is high. We demonstrate our proposed sampler by using it with Metropolis-Hastings-within-Gibbs sampling in numerical experiments in image deblurring, computerized tomography, and NMR relaxometry.
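The idea described in the abstract can be sketched in a few lines of NumPy. This is not the authors' implementation, only a minimal illustration under simplifying assumptions: an identity prior covariance (so the prior-preconditioned Hessian is just the scaled normal-equations matrix), a small synthetic forward operator with a rapidly decaying spectrum, and illustrative sizes and noise level. The proposal covariance keeps the top-k eigenpairs of the prior-preconditioned Hessian and inverts via the Woodbury identity; an independence Metropolis-Hastings step then corrects for the truncation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ill-posed linear inverse problem y = A x + noise; the
# rapidly decaying singular values of A mimic ill-posedness.
n, sigma = 50, 0.1
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
W, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 2.0 ** (-np.arange(n))            # decaying spectrum
A = U @ np.diag(s) @ W.T
x_true = rng.standard_normal(n)
y = A @ x_true + sigma * rng.standard_normal(n)

# Gaussian prior x ~ N(0, I), so the prior-preconditioned Hessian
# reduces to H = A^T A / sigma^2.
H = A.T @ A / sigma**2
lam, V = np.linalg.eigh(H)
lam, V = lam[::-1], V[:, ::-1]        # eigenpairs in descending order

# Exact Gaussian posterior: precision = H + I, mean = prec^{-1} A^T y / sigma^2.
Sigma_inv = H + np.eye(n)
mu = np.linalg.solve(Sigma_inv, A.T @ y / sigma**2)

# Rank-k proposal covariance via the Woodbury identity:
# (I + V_k Lam_k V_k^T)^{-1} = I - V_k diag(lam_i / (1 + lam_i)) V_k^T.
k = 10
Vk, lk = V[:, :k], lam[:k]
Sigma_k = np.eye(n) - Vk @ np.diag(lk / (1.0 + lk)) @ Vk.T
Sigma_k_inv = np.eye(n) + Vk @ np.diag(lk) @ Vk.T
mu_k = Sigma_k @ (A.T @ y) / sigma**2  # approximate posterior mean
Lchol = np.linalg.cholesky(Sigma_k)

def log_gauss(x, mean, prec):
    """Log-density of N(mean, prec^{-1}) up to a constant; the
    log-determinants cancel in the acceptance ratio below."""
    r = x - mean
    return -0.5 * r @ prec @ r

# Independence Metropolis-Hastings: accept with prob
# pi(z) q(x) / (pi(x) q(z)), both Gaussians evaluated in log space.
x = mu_k.copy()
accepted, n_steps = 0, 2000
for _ in range(n_steps):
    z = mu_k + Lchol @ rng.standard_normal(n)   # independent proposal draw
    log_alpha = (log_gauss(z, mu, Sigma_inv) - log_gauss(x, mu, Sigma_inv)
                 + log_gauss(x, mu_k, Sigma_k_inv) - log_gauss(z, mu_k, Sigma_k_inv))
    if np.log(rng.uniform()) < log_alpha:
        x, accepted = z, accepted + 1

rate = accepted / n_steps
print(f"rank-{k} proposal acceptance rate: {rate:.3f}")
```

Because the discarded eigenvalues of the prior-preconditioned Hessian are tiny in this synthetic example, the rank-10 proposal is already close to the exact posterior and the acceptance rate is near 1, consistent with the paper's observation that the acceptance rate depends on the number of eigenvalues retained.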


Full work available at URL: https://arxiv.org/abs/1609.07180


Cited In (20)

This page was built for publication: Low-rank independence samplers in hierarchical Bayesian inverse problems
