Convergence of Stein Variational Gradient Descent under a Weaker Smoothness Condition

Publication: 6400815

arXiv: 2206.00508 · MaRDI QID: Q6400815 · FDO: Q6400815


Authors: Lukang Sun, Avetik Karagulyan, Peter Richtárik


Publication date: 1 June 2022

Abstract: Stein Variational Gradient Descent (SVGD) is an important alternative to Langevin-type algorithms for sampling from probability distributions of the form $\pi(x) \propto \exp(-V(x))$. In the existing theory of Langevin-type algorithms and SVGD, the potential function $V$ is often assumed to be $L$-smooth. However, this restrictive condition excludes a large class of potential functions, such as polynomials of degree greater than 2. Our paper studies the convergence of the SVGD algorithm for distributions with $(L_0, L_1)$-smooth potentials. This relaxed smoothness assumption was introduced by Zhang et al. [2019a] for the analysis of gradient clipping algorithms. With the help of trajectory-independent auxiliary conditions, we provide a descent lemma establishing that the algorithm decreases the $\mathrm{KL}$ divergence at each iteration, and we prove a complexity bound for SVGD in the population limit in terms of the Stein Fisher information.