Hyperparameter estimation in Bayesian MAP estimation: parameterizations and consistency
From MaRDI portal
Publication:2188103
Abstract: The Bayesian formulation of inverse problems is attractive for three primary reasons: it provides a clear modelling framework, it yields a means for uncertainty quantification, and it allows for principled learning of hyperparameters. The posterior distribution may be explored by sampling methods, but for many problems it is computationally infeasible to do so. In this situation maximum a posteriori (MAP) estimators are often sought. Whilst these are relatively cheap to compute, and have an attractive variational formulation, a key drawback is their lack of invariance under change of parameterization. This is a particularly significant issue when hierarchical priors are employed to learn hyperparameters. In this paper we study the effect of the choice of parameterization on MAP estimators when a conditionally Gaussian hierarchical prior distribution is employed. Specifically we consider the centred parameterization, the natural parameterization in which the unknown state is solved for directly, and the noncentred parameterization, which works with a whitened Gaussian as the unknown state variable, and arises when considering dimension-robust MCMC algorithms; MAP estimation is well-defined in the nonparametric setting only for the noncentred parameterization. However, we show that MAP estimates based on the noncentred parameterization are not consistent as estimators of hyperparameters; conversely, we show that limits of finite-dimensional centred MAP estimators are consistent as the dimension tends to infinity. We also consider empirical Bayesian hyperparameter estimation, show consistency of these estimates, and demonstrate that they are more robust with respect to noise than centred MAP estimates. An underpinning concept throughout is that hyperparameters may only be recovered up to measure equivalence, a well-known phenomenon in the context of the Ornstein-Uhlenbeck process.
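The centred/noncentred distinction described in the abstract can be illustrated with a minimal sketch (not code from the paper; the scalar hyperparameter `sigma` and the i.i.d. Gaussian prior `u | sigma ~ N(0, sigma^2 I)` are simplifying assumptions chosen for illustration). The centred parameterization treats the state `u` itself as the unknown, while the noncentred one works with the whitened variable `xi`, related by `u = sigma * xi`; the two induce the same prior sample, but their finite-dimensional negative log-prior terms depend on `sigma` differently, which is the root of the non-invariance of MAP estimators:

```python
import numpy as np

# Hedged sketch: conditionally Gaussian hierarchical prior
# u | sigma ~ N(0, sigma^2 I) in d dimensions (illustrative choice,
# not the general covariance operator C(theta) of the paper).
rng = np.random.default_rng(0)
d = 1000
sigma = 2.0                    # hyperparameter (illustrative value)

xi = rng.standard_normal(d)    # noncentred (whitened) state, xi ~ N(0, I)
u = sigma * xi                 # centred state: same sample in law

# Finite-dimensional negative log-priors for the state, given sigma:
# centred:    ||u||^2 / (2 sigma^2) + d * log(sigma)   (depends on sigma)
# noncentred: ||xi||^2 / 2                             (independent of sigma)
centred_nlp = u @ u / (2 * sigma**2) + d * np.log(sigma)
noncentred_nlp = xi @ xi / 2

# The two objectives differ exactly by the Jacobian term d * log(sigma),
# so MAP estimates of sigma under the two parameterizations differ.
print(centred_nlp - noncentred_nlp, d * np.log(sigma))
```

The `d * log(sigma)` discrepancy grows with the discretization dimension `d`, which is a finite-dimensional shadow of the infinite-dimensional issues (well-posedness of noncentred MAP estimation, inconsistency of its hyperparameter estimates) that the paper analyses.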
Recommendations
- MAP estimators and their consistency in Bayesian nonparametric inverse problems
- Maximum a posteriori probability estimates in infinite-dimensional Bayesian inverse problems
- Maximum a posteriori estimates in linear inverse problems with log-concave priors are proper Bayes estimators
Cites work
- A Note on Consistent Estimation of Multivariate Parameters in Ergodic Diffusion Models
- A general framework for the parametrization of hierarchical models
- An explicit link between Gaussian fields and Gaussian Markov random fields: the stochastic partial differential equation approach
- Analysis of boundary effects on PDE-based sampling of Whittle-Matérn random fields
- Analysis of the Gibbs Sampler for Hierarchical Inverse Problems
- Bayes procedures for adaptive inference in inverse problems for the white noise model
- Bayesian inverse problems with Gaussian priors
- Bayesian inverse problems with non-conjugate priors
- Bayesian inverse problems with partial observations
- Bayesian learning for neural networks
- Bayesian posterior contraction rates for linear severely ill-posed inverse problems
- Bayesian recovery of the initial condition for the heat equation
- Bernstein-von Mises theorems for statistical inverse problems. I: Schrödinger equation
- Bernstein-von Mises theorems for statistical inverse problems. II: Compound Poisson processes
- Convergence Rates for Penalized Least Squares Estimators in PDE Constrained Regression Problems
- Estimation of the Hurst parameter from discrete noisy data
- Generalized Modes in Bayesian Inverse Problems
- Hierarchical Bayesian level set inversion
- Inverse problems: a Bayesian perspective
- Linear inverse problems for generalised random variables
- MAP estimators and their consistency in Bayesian nonparametric inverse problems
- MCMC methods for diffusion bridges
- MCMC methods for functions: modifying old algorithms to make them faster
- Machine learning. A probabilistic perspective
- Maximum a posteriori probability estimates in infinite-dimensional Bayesian inverse problems
- Mitigating the influence of the boundary on PDE-based covariance operators
- Non-Gaussian statistical inverse problems. II: Posterior convergence for approximated unknowns
- Non-Gaussian statistical inverse problems. Part I: Posterior distributions
- Nonparametric statistical inference for drift vector fields of multi-dimensional diffusions
- On inference for partially observed nonlinear diffusion models using the Metropolis-Hastings algorithm
- On the brittleness of Bayesian inference
- Parameterizations for ensemble Kalman inversion
- Posterior contraction in Bayesian inverse problems under Gaussian priors
- Posterior contraction rates for the Bayesian approach to linear ill-posed inverse problems
- Rates of contraction of posterior distributions based on p-exponential priors
- Sequential Monte Carlo methods for Bayesian elliptic inverse problems
- Sparsity-promoting and edge-preserving maximum a posteriori estimators in non-parametric Bayesian inverse problems
- Statistical and computational inverse problems
- Weak convergence and empirical processes. With applications to statistics
- Well-posed stochastic extensions of ill-posed linear problems
- Whittle-Matérn priors for Bayesian statistical inversion with applications in electrical impedance tomography
Cited in (11)
- Optimization-Based Markov Chain Monte Carlo Methods for Nonlinear Hierarchical Statistical Inverse Problems
- A Bayesian approach for consistent reconstruction of inclusions
- Uncertainty Quantification of Inclusion Boundaries in the Context of X-Ray Tomography
- Maximum a posteriori probability estimates in infinite-dimensional Bayesian inverse problems
- Hybrid iterative ensemble smoother for history matching of hierarchical models
- Hyperparameter estimation using a resolution matrix for Bayesian sensing
- Do ideas have shape? Idea registration as the continuous limit of artificial neural networks
- Convergence of Gaussian process regression with estimated hyper-parameters and applications in Bayesian inverse problems
- Non-centered parametric variational Bayes’ approach for hierarchical inverse problems of partial differential equations
- Consistency of empirical Bayes and kernel flow for hierarchical parameter estimation
- Convergence rates for ansatz‐free data‐driven inference in physically constrained problems