Fast sampling in a linear-Gaussian inverse problem
From MaRDI portal
Publication:3179326
Abstract: We solve the inverse problem of deblurring a pixelized image of Jupiter both by regularized deconvolution and by sample-based Bayesian inference. By efficiently sampling the marginal posterior distribution over hyperparameters, then the full conditional distribution over the deblurred image, we find that we can evaluate the posterior mean faster than regularized inversion once selection of the regularizing parameter is taken into account. To our knowledge, this is the first demonstration of sampling and inference taking less compute time than regularized inversion in an inverse problem. Comparison with random-walk Metropolis-Hastings and block Gibbs MCMC shows that marginal-then-conditional sampling also outperforms these more common sampling algorithms, scaling better with problem size. When problem-specific computations are feasible, the asymptotic cost of an independent sample is one linear solve, implying that sample-based Bayesian inference may be performed directly over function spaces, when that limit exists.
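The marginal-then-conditional scheme described in the abstract can be sketched on a toy linear-Gaussian model. The sketch below is illustrative only, not the paper's implementation: all names (A, lam, delta, the proposal step size) are assumptions, the hyperprior is taken flat for simplicity, and a generic random-walk Metropolis step stands in for whatever marginal sampler the paper uses. It shows the two stages: sample the hyperparameters from their marginal posterior (the image integrated out), then draw the image from its full conditional Gaussian at the cost of one factorization/solve per sample.

```python
# Hypothetical sketch of marginal-then-conditional sampling for a small
# linear-Gaussian inverse problem y = A x + e, with
#   x ~ N(0, delta^{-1} I)   (prior precision delta, hyperparameter)
#   e ~ N(0, lam^{-1} I)     (noise precision lam, hyperparameter)
# Flat hyperprior assumed; names and sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Small synthetic forward model (stand-in for a blurring operator).
n, m = 20, 15
A = rng.standard_normal((n, m)) / np.sqrt(m)
x_true = rng.standard_normal(m)
lam_true = 25.0                                    # true noise precision
y = A @ x_true + rng.standard_normal(n) / np.sqrt(lam_true)

def log_marginal(lam, delta):
    """Log of p(y | lam, delta): y ~ N(0, delta^{-1} A A^T + lam^{-1} I),
    i.e. the posterior over x has been integrated out analytically."""
    if lam <= 0 or delta <= 0:
        return -np.inf
    C = (A @ A.T) / delta + np.eye(n) / lam
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + y @ np.linalg.solve(C, y))

def sample_conditional(lam, delta):
    """One independent draw from x | y, lam, delta = N(mu, Q^{-1}) with
    Q = lam A^T A + delta I; one Cholesky factorization per sample."""
    Q = lam * (A.T @ A) + delta * np.eye(m)
    L = np.linalg.cholesky(Q)                       # Q = L L^T
    mu = np.linalg.solve(Q, lam * (A.T @ y))        # posterior mean of x
    z = rng.standard_normal(m)
    return mu + np.linalg.solve(L.T, z)             # cov of L^{-T} z is Q^{-1}

# Stage 1: random-walk Metropolis on (log lam, log delta) over the
# low-dimensional marginal posterior.  Stage 2: a conditional image
# draw for each retained hyperparameter sample.
lam, delta = 1.0, 1.0
cur_lp = log_marginal(lam, delta)
samples = []
for it in range(2000):
    prop = np.exp(np.log([lam, delta]) + 0.3 * rng.standard_normal(2))
    prop_lp = log_marginal(*prop)
    # log-normal proposal: include the Jacobian of the log transform
    log_ratio = (prop_lp - cur_lp
                 + np.sum(np.log(prop)) - np.log(lam) - np.log(delta))
    if np.log(rng.uniform()) < log_ratio:
        (lam, delta), cur_lp = prop, prop_lp
    if it >= 1000:                                  # discard burn-in
        samples.append(sample_conditional(lam, delta))

x_mean = np.mean(samples, axis=0)
print("relative error of posterior mean:",
      np.linalg.norm(x_mean - x_true) / np.linalg.norm(x_true))
```

Because the hyperparameter chain runs on a two-dimensional marginal rather than the joint (image, hyperparameter) space, it mixes without the strong cross-correlations that slow block Gibbs, and each image sample is independent given the hyperparameters, which is the source of the one-linear-solve-per-sample cost the abstract refers to.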
Recommendations
- Low-rank independence samplers in hierarchical Bayesian inverse problems
- Fast Gibbs sampling for high-dimensional Bayesian inversion
- Fast Markov chain Monte Carlo sampling for sparse Bayesian inference in high-dimensional inverse problems using L1-type priors
- Fast Bayesian inversion for high dimensional inverse problems
- Fast inference for statistical inverse problems
Cites work
- scientific article; zbMATH DE number 3942888
- scientific article; zbMATH DE number 193258
- scientific article; zbMATH DE number 556557
- scientific article; zbMATH DE number 1124118
- scientific article; zbMATH DE number 2020395
- scientific article; zbMATH DE number 1865746
- scientific article; zbMATH DE number 849920
- A Bayesian linear model for the high-dimensional inverse problem of seismic tomography
- A general framework for the parametrization of hierarchical models
- A primer on space-time modeling from a Bayesian perspective
- An invariant form for the prior probability in estimation problems
- Analysis of the Gibbs Sampler for Hierarchical Inverse Problems
- Central limit theorem for additive functionals of reversible Markov processes and applications to simple exclusions
- Computational Methods for Inverse Problems
- Efficient Gaussian Sampling for Solving Large-Scale Inverse Problems Using MCMC
- Estimates of the trace of the inverse of a symmetric matrix using the modified Chebyshev algorithm
- Gaussian Markov Random Fields
- Harold Jeffreys's \textit{Theory of probability} revisited
- Inverse acoustic and electromagnetic scattering theory.
- MCMC-based image reconstruction with uncertainty quantification
- Marginal Markov chain Monte Carlo methods
- Monte Carlo errors with less errors
- Numerical Methods in Scientific Computing, Volume I
- Overall objective priors
- Pattern theory. From representation to inference.
- Radiocarbon Dating with Temporal Order Constraints
- Regularization tools: A Matlab package for analysis and solution of discrete ill-posed problems
- Retrospective Markov chain Monte Carlo methods for Dirichlet process hierarchical models
- Spatiotemporal Hierarchical Bayesian Modeling Tropical Ocean Surface Winds
- The Use of the L-Curve in the Regularization of Discrete Ill-Posed Problems
- Traces and determinants of linear operators
Cited in (21)
- Fast sampling of parameterised Gaussian random fields
- Optimization-Based Markov Chain Monte Carlo Methods for Nonlinear Hierarchical Statistical Inverse Problems
- Efficient Marginalization-Based MCMC Methods for Hierarchical Bayesian Inverse Problems
- Rank bounds for approximating Gaussian densities in the tensor-train format
- Sampling Strategies for Fast Updating of Gaussian Markov Random Fields
- Large-scale Bayesian spatial-temporal regression with application to cardiac MR-perfusion imaging
- Point spread function estimation in X-ray imaging with partially collapsed Gibbs sampling
- MALA-within-Gibbs samplers for high-dimensional distributions with sparse conditional structure
- Sampled limited memory methods for massive linear inverse problems
- Bayesian estimation and uncertainty quantification in models of urea hydrolysis by \textit{E. coli} biofilms
- Polynomial Accelerated Solutions to a Large Gaussian Model for Imaging Biofilms: In Theory and Finite Precision
- Hybrid iterative ensemble smoother for history matching of hierarchical models
- Randomized reduced forward models for efficient Metropolis-Hastings MCMC, with application to subsurface fluid flow and capacitance tomography
- Sampling hyperparameters in hierarchical models: Improving on Gibbs for high-dimensional latent fields and large datasets
- Low-rank independence samplers in hierarchical Bayesian inverse problems
- Localization for MCMC: sampling high-dimensional posterior distributions with local structure
- A blocking scheme for dimension-robust Gibbs sampling in large-scale image deblurring
- Fast iterative solution of sparsely sampled seismic inverse problems
- Approximation and sampling of multivariate probability distributions in the tensor train decomposition
- A CVAE-within-Gibbs sampler for Bayesian linear inverse problems with hyperparameters
- Sampling linear inverse problems with noise