Hyperparameter estimation in Bayesian MAP estimation: parameterizations and consistency
DOI: 10.5802/SMAI-JCM.62 · zbMATH Open: 1441.62084 · arXiv: 1905.04365 · OpenAlex: W3019884851 · MaRDI QID: Q2188103 · FDO: Q2188103
T. Helin, M. M. Dunlop, A. M. Stuart
Publication date: 3 June 2020
Published in: The SMAI Journal of Computational Mathematics
Full work available at URL: https://arxiv.org/abs/1905.04365
Recommendations
- MAP estimators and their consistency in Bayesian nonparametric inverse problems
- Maximum a posteriori probability estimates in infinite-dimensional Bayesian inverse problems
- Maximum a posteriori estimates in linear inverse problems with log-concave priors are proper Bayes estimators
- scientific article; zbMATH DE number 1114393
- scientific article; zbMATH DE number 1098876
Keywords: optimization; nonparametric inference; Bayesian inverse problems; MAP estimation; hierarchical Bayesian; consistency of estimators; maximum a posteriori (MAP); hyperparameter inference
MSC classification: Nonparametric estimation (62G05) · Asymptotic properties of nonparametric inference (62G20) · Bayesian problems; characterization of Bayes procedures (62C10) · Inverse problems for integral equations (45Q05)
Cites Work
- Weak convergence and empirical processes. With applications to statistics
- An Explicit Link between Gaussian Fields and Gaussian Markov Random Fields: The Stochastic Partial Differential Equation Approach
- Statistical and computational inverse problems.
- Mitigating the influence of the boundary on PDE-based covariance operators
- Machine learning. A probabilistic perspective
- A general framework for the parametrization of hierarchical models
- Whittle-Matérn priors for Bayesian statistical inversion with applications in electrical impedance tomography
- Nonparametric statistical inference for drift vector fields of multi-dimensional diffusions
- Bernstein-von Mises theorems for statistical inverse problems. II: Compound Poisson processes
- Bernstein-von Mises theorems for statistical inverse problems. I: Schrödinger equation
- Estimation of the Hurst parameter from discrete noisy data
- Bayesian posterior contraction rates for linear severely ill-posed inverse problems
- Posterior contraction rates for the Bayesian approach to linear ill-posed inverse problems
- Inverse problems: a Bayesian perspective
- Bayesian inverse problems with non-conjugate priors
- Non-Gaussian statistical inverse problems. Part I: Posterior distributions
- Non-Gaussian statistical inverse problems. II: Posterior convergence for approximated unknowns
- Posterior Contraction in Bayesian Inverse Problems Under Gaussian Priors
- Linear inverse problems for generalised random variables
- Bayesian inverse problems with Gaussian priors
- Bayes procedures for adaptive inference in inverse problems for the white noise model
- Analysis of the Gibbs Sampler for Hierarchical Inverse Problems
- Bayesian Recovery of the Initial Condition for the Heat Equation
- Well-posed stochastic extensions of ill-posed linear problems
- On inference for partially observed nonlinear diffusion models using the Metropolis-Hastings algorithm
- MCMC methods for functions: modifying old algorithms to make them faster
- Bayesian learning for neural networks
- MCMC METHODS FOR DIFFUSION BRIDGES
- MAP estimators and their consistency in Bayesian nonparametric inverse problems
- Sequential Monte Carlo methods for Bayesian elliptic inverse problems
- Rates of contraction of posterior distributions based on \(p\)-exponential priors
- Bayesian inverse problems with partial observations
- Parameterizations for ensemble Kalman inversion
- On the brittleness of Bayesian inference
- A Note on Consistent Estimation of Multivariate Parameters in Ergodic Diffusion Models
- Convergence Rates for Penalized Least Squares Estimators in PDE Constrained Regression Problems
- Maximum a posteriori probability estimates in infinite-dimensional Bayesian inverse problems
- Hierarchical Bayesian level set inversion
- Analysis of Boundary Effects on PDE-Based Sampling of Whittle–Matérn Random Fields
- Sparsity-promoting and edge-preserving maximum a posteriori estimators in non-parametric Bayesian inverse problems
- Generalized Modes in Bayesian Inverse Problems
Cited In (10)
- Do ideas have shape? Idea registration as the continuous limit of artificial neural networks
- A Bayesian approach for consistent reconstruction of inclusions
- Consistency of empirical Bayes and kernel flow for hierarchical parameter estimation
- Convergence rates for ansatz‐free data‐driven inference in physically constrained problems
- Hybrid iterative ensemble smoother for history matching of hierarchical models
- Hyperparameter estimation using a resolution matrix for Bayesian sensing
- Uncertainty Quantification of Inclusion Boundaries in the Context of X-Ray Tomography
- Optimization-Based Markov Chain Monte Carlo Methods for Nonlinear Hierarchical Statistical Inverse Problems
- Convergence of Gaussian Process Regression with Estimated Hyper-Parameters and Applications in Bayesian Inverse Problems
- Non-centered parametric variational Bayes’ approach for hierarchical inverse problems of partial differential equations