Regularized Rényi divergence minimization through Bregman proximal gradient algorithms

Publication: 6416656

arXiv: 2211.04776
MaRDI QID: Q6416656
FDO: Q6416656


Authors: Thomas Guilmeau, E. Chouzenoux, Víctor Elvira


Publication date: 9 November 2022

Abstract: We study the variational inference problem of minimizing a regularized Rényi divergence over an exponential family, and propose a relaxed moment-matching algorithm, which includes a proximal-like step. Using the information-geometric link between Bregman divergences and the Kullback-Leibler divergence, this algorithm is shown to be equivalent to a Bregman proximal gradient algorithm. This novel perspective allows us to exploit the geometry of our approximate model while using stochastic black-box updates. We use this point of view to prove strong convergence guarantees including monotonic decrease of the objective, convergence to a stationary point or to the minimizer, and geometric convergence rates. These new theoretical insights lead to a versatile, robust, and competitive method, as illustrated by numerical experiments.
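
To make the abstract concrete, the following is a minimal numerical sketch of a relaxed moment-matching iteration of the kind described above, for a one-dimensional Gaussian approximating family and with no regularization term. It is an illustration under simplifying assumptions, not the authors' exact algorithm: the target log_target, the Rényi order alpha, the step size gamma, and the self-normalized importance-weighting scheme are all hypothetical choices made here for illustration. Each iteration draws samples from the current approximation q, forms weights proportional to (pi/q)^alpha from black-box evaluations of the unnormalized target, and relaxes the mean parameters (E[x], E[x^2]) toward the weighted moment estimates with step size gamma.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Hypothetical unnormalized log-density of the target (a Gaussian N(3, 0.5^2) here);
    # in the black-box setting it is only accessed through pointwise evaluations.
    return -0.5 * ((x - 3.0) / 0.5) ** 2

alpha = 0.7    # Renyi order (illustrative choice)
gamma = 0.3    # relaxation / step size of the moment-matching update
n = 5000       # Monte Carlo samples per iteration

# Mean parameters of a 1D Gaussian exponential family: mu = (E[x], E[x^2]).
# Start from N(0, 2^2), i.e. E[x] = 0, E[x^2] = 4.
mu = np.array([0.0, 4.0])

for _ in range(50):
    m = mu[0]
    var = mu[1] - mu[0] ** 2
    x = rng.normal(m, np.sqrt(var), size=n)                  # samples from current q
    log_q = -0.5 * ((x - m) ** 2 / var + np.log(2 * np.pi * var))
    log_w = alpha * (log_target(x) - log_q)                  # weights prop. to (pi/q)^alpha
    w = np.exp(log_w - log_w.max())
    w /= w.sum()                                             # self-normalization
    mu_hat = np.array([np.sum(w * x), np.sum(w * x ** 2)])   # weighted moment estimates
    mu = (1 - gamma) * mu + gamma * mu_hat                   # relaxed moment matching

print("fitted mean and std:", mu[0], np.sqrt(mu[1] - mu[0] ** 2))
```

Loosely speaking, because the mean parameters of an exponential family are the gradient of its log-partition function, such a relaxed update in mean-parameter space can be read as a Bregman proximal gradient (mirror-descent-like) step in the natural parameters; this is the kind of equivalence the abstract refers to.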

This page was built for publication: Regularized Rényi divergence minimization through Bregman proximal gradient algorithms
