Accelerated information gradient flow

From MaRDI portal
Publication:2053344

DOI: 10.1007/S10915-021-01709-3 · zbMATH Open: 1484.90059 · arXiv: 1909.02102 · OpenAlex: W2971906246 · MaRDI QID: Q2053344 · FDO: Q2053344


Authors: Wuchen Li, Yifei Wang


Publication date: 29 November 2021

Published in: Journal of Scientific Computing

Abstract: We present a framework for Nesterov's accelerated gradient flows in probability space to design efficient mean-field Markov chain Monte Carlo (MCMC) algorithms for Bayesian inverse problems. Four examples of information metrics are considered: the Fisher-Rao metric, the Wasserstein-2 metric, the Kalman-Wasserstein metric, and the Stein metric. For both the Fisher-Rao and Wasserstein-2 metrics, we prove convergence properties of the accelerated gradient flows. For implementation, we propose a sampling-efficient discrete-time algorithm for the Wasserstein-2, Kalman-Wasserstein, and Stein accelerated gradient flows with a restart technique. We also formulate a kernel bandwidth selection method, which learns the gradient of the logarithm of the density from Brownian-motion samples. Numerical experiments, including Bayesian logistic regression and Bayesian neural networks, show the strength of the proposed methods compared with state-of-the-art algorithms.
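The Wasserstein-2 case described in the abstract can be caricatured with interacting particles: the Wasserstein gradient of KL(ρ ‖ e^{-f}) is ∇f + ∇log ρ, the unknown score ∇log ρ is estimated from the particles with a Gaussian kernel (the paper learns the bandwidth; here it is fixed), and a gradient-based restart zeroes the momentum whenever the velocity points uphill. This is only a minimal sketch of the idea, not the authors' discretization; the Gaussian target, step size, damping, and bandwidth below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_f(x):
    # Potential of a standard Gaussian target: f(x) = ||x||^2 / 2
    return x

def kde_score(X, h):
    """Gaussian-KDE estimate of the score grad log rho at each particle."""
    diff = X[:, None, :] - X[None, :, :]                  # (n, n, d) pairwise x_i - x_j
    w = np.exp(-np.sum(diff**2, axis=-1) / (2 * h**2))    # kernel weights K(x_i - x_j)
    # grad log rho_hat(x_i) = sum_j w_ij (x_j - x_i) / (h^2 sum_j w_ij)
    return (w[..., None] * (-diff)).sum(axis=1) / (h**2 * w.sum(axis=1)[:, None])

def accelerated_flow(X0, step=0.05, damping=1.0, h=0.5, iters=300):
    """Momentum (Nesterov-style) particle flow with a gradient restart."""
    X, V = X0.copy(), np.zeros_like(X0)
    for _ in range(iters):
        G = grad_f(X) + kde_score(X, h)       # estimated Wasserstein gradient of KL
        V = (1 - damping * step) * V - step * G
        if np.sum(G * V) > 0:                 # velocity points uphill: restart
            V = np.zeros_like(V)
        X = X + step * V
    return X

# Particles start far from the target mode and are pulled toward the origin.
X = accelerated_flow(rng.normal(3.0, 1.0, size=(100, 2)))
```

The restart heuristic (resetting momentum when it stops being a descent direction) is the standard adaptive-restart trick for Nesterov schemes; the paper's restart technique plays an analogous role in probability space.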


Full work available at: https://arxiv.org/abs/1909.02102










Cited In (13)






This page was built for publication: Accelerated information gradient flow
