Bayesian Imaging Using Plug & Play Priors: When Langevin Meets Tweedie
From MaRDI portal
Publication:5094615
Recommendations
- Bayesian Imaging with Data-Driven Priors Encoded by Neural Networks
- Efficient Bayesian Computation for Low-Photon Imaging Problems
- Hypermodels in the Bayesian imaging framework
- On the Bayesian approach to image reconstruction
- A Bayesian approach to synthetic magnetic resonance imaging
- A proximal Markov chain Monte Carlo method for Bayesian inference in imaging inverse problems: when Langevin meets Moreau
Cites work
- Scientific article (zbMATH DE number 1432028; no title available)
- Scientific article (zbMATH DE number 3411371; no title available)
- Accelerating Proximal Markov Chain Monte Carlo by Using an Explicit Stabilized Method
- An algorithm for total variation minimization and applications
- Approximate models and robust decisions
- Beyond a Gaussian Denoiser: Residual Learning of Deep CNN for Image Denoising
- Breaking the curse of dimensionality with convex neural networks
- Convex analysis and monotone operator theory in Hilbert spaces
- DeepISP: Toward Learning an End-to-End Image Processing Pipeline
- Efficient Bayesian computation by proximal Markov chain Monte Carlo: when Langevin meets Moreau
- Exponential convergence of Langevin distributions and their discrete approximations
- From Denoising to Compressed Sensing
- High-dimensional mixture models for unsupervised image denoising (HDMI)
- Inverse problems: a Bayesian perspective
- Learning maximally monotone operators for image recovery
- Markov random fields for vision and image processing
- Nonasymptotic convergence analysis for the unadjusted Langevin algorithm
- Nonlinear total variation based noise removal algorithms
- On the well-posedness of Bayesian inverse problems
- Optimal Transport
- Plug in estimation in high dimensional linear inverse problems a rigorous analysis
- Plug-and-Play Unplugged: Optimization-Free Reconstruction Using Consensus Equilibrium
- Posterior expectation of the total variation model: properties and experiments
- Proximal Markov chain Monte Carlo algorithms
- Regularization by denoising via fixed-point projection (RED-PRO)
- Riemann manifold Langevin and Hamiltonian Monte Carlo methods. With discussion and authors' reply
- Scalable Bayesian uncertainty quantification in imaging inverse problems via convex optimization
- Solving Inverse Problems With Piecewise Linear Estimators: From Gaussian Mixture Models to Structured Sparsity
- Solving inverse problems using data-driven models
- State evolution for general approximate message passing algorithms, with applications to spatial coupling
- The Bayesian Choice
- The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing
- Theoretical Guarantees for Approximate Sampling from Smooth and Log-Concave Densities
- Tweedie’s Formula and Selection Bias
- Variational networks: an optimal control approach to early stopping variational methods for image restoration
- What regularized auto-encoders learn from the data-generating distribution
Cited in (26)
- NF-ULA: normalizing flow-based unadjusted Langevin algorithm for imaging inverse problems
- Efficient Bayesian Computation for Low-Photon Imaging Problems
- A proximal Markov chain Monte Carlo method for Bayesian inference in imaging inverse problems: when Langevin meets Moreau
- CUQIpy: I. Computational uncertainty quantification for inverse problems in Python
- On maximum a posteriori estimation with Plug & Play priors and stochastic gradient descent
- Accelerated Bayesian imaging by relaxed proximal-point Langevin sampling
- Marginal likelihood estimation in semiblind image deconvolution: a stochastic approximation approach
- Training adaptive reconstruction networks for blind inverse problems
- PnP-ReG: Learned Regularizing Gradient for Plug-and-Play Gradient Descent
- Posterior-Variance-Based Error Quantification for Inverse Problems in Imaging
- Self-Supervised Deep Learning for Image Reconstruction: A Langevin Monte Carlo Approach
- Bayesian Inverse Problems Are Usually Well-Posed
- Provably Convergent Plug-and-Play Quasi-Newton Methods
- Proximal Langevin sampling with inexact proximal mapping
- Image Denoising: The Deep Learning Revolution and Beyond—A Survey Paper
- Subgradient Langevin methods for sampling from nonsmooth potentials
- The Split Gibbs Sampler Revisited: Improvements to Its Algorithmic Structure and Augmented Target Distribution
- Noise-free sampling algorithms via regularized Wasserstein proximals
- Bayesian imaging using Plug & Play priors: when Langevin meets Tweedie
- Invertible residual networks in the context of regularization theory for linear inverse problems
- Neural-network-based regularization methods for inverse problems in imaging
- Bayesian Imaging with Data-Driven Priors Encoded by Neural Networks
- Learning from small data sets: patch-based regularizers in inverse problems for image reconstruction
- Robustness and exploration of variational and machine learning approaches to inverse problems: an overview
- Wasserstein steepest descent flows of discrepancies with Riesz kernels
- Asymptotic bias of inexact Markov chain Monte Carlo methods in high dimension