Sparse reconstructions from few noisy data: analysis of hierarchical Bayesian models with generalized gamma hyperpriors
From MaRDI portal
Publication: 5213327
DOI: 10.1088/1361-6420/ab4d92
zbMath: 1464.62365
OpenAlex: W2980939245
MaRDI QID: Q5213327
Alexander Strang, Monica Pragliola, Erkki Somersalo, Daniela Calvetti
Publication date: 3 February 2020
Published in: Inverse Problems
Full work available at URL: https://doi.org/10.1088/1361-6420/ab4d92
Related Items
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors
- Sampling-based Spotlight SAR Image Reconstruction from Phase History Data for Speckle Reduction and Uncertainty Quantification
- Automatic fidelity and regularization terms selection in variational image restoration
- Empirical Bayesian Inference Using a Support Informed Prior
- A CVAE-within-Gibbs sampler for Bayesian linear inverse problems with hyperparameters
- Sparsity promoting reconstructions via hierarchical prior models in diffuse optical tomography
- Generalized Sparse Bayesian Learning and Application to Image Reconstruction
- Sequential edge detection using joint hierarchical Bayesian learning
- Adaptive anisotropic Bayesian meshing for inverse problems
- Inducing sparsity via the horseshoe prior in imaging problems
- On and Beyond Total Variation Regularization in Imaging: The Role of Space Variance
- Hierarchical ensemble Kalman methods with sparsity-promoting generalized gamma hyperpriors
- Maximum Likelihood Estimation of Regularization Parameters in High-Dimensional Inverse Problems: An Empirical Bayesian Approach Part I: Methodology and Experiments
- Sparsity Promoting Hybrid Solvers for Hierarchical Bayesian Inverse Problems
- Bayesian Mesh Adaptation for Estimating Distributed Parameters
- Where Bayes tweaks Gauss: conditionally Gaussian priors for stable multi-dipole estimation
- Overcomplete representation in a hierarchical Bayesian framework
- Bayesian hierarchical dictionary learning
Cites Work
- Preconditioned iterative methods for linear discrete ill-posed problems from a Bayesian inversion perspective
- Proximal Markov chain Monte Carlo algorithms
- Functional error estimators for the adaptive discretization of inverse problems
- Conditionally Gaussian Hypermodels for Cerebral Source Localization
- Sparsity-promoting Bayesian inversion
- A hierarchical Krylov–Bayes iterative inverse solver for MEG with physiological preconditioning
- Hypermodels in the Bayesian imaging framework
- Iteratively reweighted least squares minimization for sparse recovery
- Lower bounds on the maximum cross correlation of signals (Corresp.)
- Convergence of an Iterative Method for Total Variation Denoising
- Hierarchical Bayesian models and sparsity: ℓ2-magic
- Sparsity-promoting and edge-preserving maximum a posteriori estimators in non-parametric Bayesian inverse problems
- Bayes Meets Krylov: Statistically Inspired Preconditioners for CGLS
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Efficient Bayesian Computation by Proximal Markov Chain Monte Carlo: When Langevin Meets Moreau
- Sparse Approximate Solutions to Linear Systems
- Sparsity regularization in inverse problems
- Priorconditioned CGLS-Based Quasi-MAP Estimate, Statistical Stopping Rule, and Ranking of Priors
- Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization
- For most large underdetermined systems of linear equations the minimal ℓ1-norm solution is also the sparsest solution
- Stable signal recovery from incomplete and inaccurate measurements
- Priorconditioners for linear systems
- Methods of conjugate gradients for solving linear systems