The Bayesian update: variational formulations and gradient flows
From MaRDI portal
Publication:2297229
Abstract: The Bayesian update can be viewed as a variational problem by characterizing the posterior as the minimizer of a functional. The variational viewpoint is far from new and is at the heart of popular methods for posterior approximation. However, some of its consequences seem largely unexplored. We focus on the following one: defining the posterior as the minimizer of a functional gives a natural path towards the posterior by moving in the direction of steepest descent of the functional. This idea is made precise through the theory of gradient flows, which allows us to bring new tools to the study of Bayesian models and algorithms. Since the posterior may be characterized as the minimizer of different functionals, several variational formulations may be considered. We study three of them and their three associated gradient flows. We show that, in all cases, the rate of convergence of the flows to the posterior can be bounded by the geodesic convexity of the functional to be minimized. Each gradient flow naturally suggests a nonlinear diffusion with the posterior as invariant distribution. These diffusions may be discretized to build proposals for Markov chain Monte Carlo (MCMC) algorithms. By construction, the diffusions are guaranteed to satisfy a certain optimality condition, and rates of convergence are given by the convexity of the functionals. We use this observation to propose a criterion for the choice of metric in Riemannian MCMC methods.
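The best-known instance of the construction described in the abstract is the Kullback-Leibler / relative-entropy gradient flow, whose associated diffusion is the overdamped Langevin diffusion; discretizing it and adding a Metropolis correction gives the Metropolis-adjusted Langevin algorithm (MALA). The following sketch is illustrative only and is not taken from the paper: the target is a hypothetical standard-normal "posterior", and the step size `h` is an arbitrary choice.

```python
import numpy as np

# Illustrative target: log pi(x) = -x^2/2 up to a constant (standard normal).
def log_post(x):
    return -0.5 * x**2

def grad_log_post(x):
    return -x

def mala(n_steps, h=0.1, x0=0.0, seed=0):
    """MALA: Euler-Maruyama discretization of the Langevin diffusion
    dX = grad log pi(X) dt + sqrt(2) dW, plus a Metropolis correction
    that restores exact invariance of the target pi."""
    rng = np.random.default_rng(seed)
    x, samples = x0, []
    for _ in range(n_steps):
        # Langevin proposal: one Euler-Maruyama step of the diffusion.
        mean_fwd = x + h * grad_log_post(x)
        y = mean_fwd + np.sqrt(2 * h) * rng.standard_normal()
        # Accept/reject with the asymmetric Gaussian proposal densities.
        mean_bwd = y + h * grad_log_post(y)
        log_alpha = (log_post(y) - log_post(x)
                     - (x - mean_bwd) ** 2 / (4 * h)
                     + (y - mean_fwd) ** 2 / (4 * h))
        if np.log(rng.uniform()) < log_alpha:
            x = y
        samples.append(x)
    return np.array(samples)

samples = mala(20000)
# Empirical mean and variance should be near 0 and 1 for this target.
print(samples.mean(), samples.var())
```

Without the Metropolis correction this is the unadjusted Langevin algorithm, whose discretization bias is one motivation for studying convergence rates of the continuous-time flow itself.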
Recommendations
Cites work
- Scientific article (untitled; zbMATH DE number 1153603)
- Scientific article (untitled; zbMATH DE number 1909499)
- A convexity principle for interacting gases
- A course in metric geometry
- A graph discretization of the Laplace-Beltrami operator
- Central limit theorem for additive functionals of reversible Markov processes and applications to simple exclusions
- Continuum limits of posteriors in graph Bayesian inverse problems
- Displacement convexity of generalized relative entropies
- Error estimates for spectral convergence of the graph Laplacian on random geometric graphs toward the Laplace-Beltrami operator
- Exponential convergence of Langevin distributions and their discrete approximations
- Gradient flows in metric spaces and in the space of probability measures
- Graphical models, exponential families, and variational inference
- Kullback-Leibler approximation for probability measures on infinite dimensional spaces
- MCMC methods for functions: modifying old algorithms to make them faster
- On the consistency of graph-based Bayesian semi-supervised learning and the scalability of sampling algorithms
- On the geometry of metric measure spaces. I
- Optimal Transport
- Optimal transport for applied mathematicians. Calculus of variations, PDEs, and modeling
- Ordinary differential equations and dynamical systems
- Riemann manifold Langevin and Hamiltonian Monte Carlo methods. With discussion and authors' reply
- Stochastic processes and applications. Diffusion processes, the Fokker-Planck and Langevin equations
- The Variational Formulation of the Fokker–Planck Equation
- Transport inequalities, gradient estimates, entropy and Ricci curvature
- Uncertainty quantification in graph-based classification of high dimensional data
Cited in (11)
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors
- Differential equation-constrained optimization with stochasticity
- Accelerated information gradient flow
- Doob's consistency of a non-Bayesian updating process
- A geometric variational approach to Bayesian inference
- Wasserstein steepest descent flows of discrepancies with Riesz kernels
- Bayesian updating with two-step parallel Bayesian optimization and quadrature
- Stein variational gradient descent: many-particle and long-time asymptotics
- On the consistency of graph-based Bayesian semi-supervised learning and the scalability of sampling algorithms
- A note on parametric Bayesian inference via gradient flows
- Bayesian Probabilistic Numerical Methods