Convergence of Langevin Monte Carlo in Chi-Squared and Rényi Divergence

From MaRDI portal
Publication:6345649

arXiv: 2007.11612 · MaRDI QID: Q6345649


Authors: Murat A. Erdogdu, Rasa Hosseinzadeh, Matthew S. Zhang


Publication date: 22 July 2020

Abstract: We study sampling from a target distribution $\nu = e^{-f}$ using the unadjusted Langevin Monte Carlo (LMC) algorithm when the potential $f$ satisfies a strong dissipativity condition and is first-order smooth with a Lipschitz gradient. We prove that, initialized with a Gaussian random vector that has sufficiently small variance, iterating the LMC algorithm for $\widetilde{\mathcal{O}}(\lambda^2 d \epsilon^{-1})$ steps is sufficient to reach an $\epsilon$-neighborhood of the target in both chi-squared and Rényi divergence, where $\lambda$ is the logarithmic Sobolev constant of $\nu$. Our results do not require a warm start to deal with the exponential dimension dependency of the chi-squared divergence at initialization. In particular, for strongly convex and first-order smooth potentials, we show that the LMC algorithm achieves the rate estimate $\widetilde{\mathcal{O}}(d \epsilon^{-1})$, which improves the previously known rates in both of these metrics under the same assumptions. Translating this rate to other metrics, our results also recover the state-of-the-art rate estimates in KL divergence, total variation, and 2-Wasserstein distance in the same setup. Finally, as we rely on the logarithmic Sobolev inequality, our framework covers a range of non-convex potentials that are first-order smooth and exhibit strong convexity outside of a compact region.
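The unadjusted LMC iteration studied in the abstract is the Euler discretization of the Langevin diffusion: $x_{k+1} = x_k - \eta \nabla f(x_k) + \sqrt{2\eta}\,\xi_k$ with $\xi_k \sim \mathcal{N}(0, I_d)$. The sketch below is an illustrative implementation, not the authors' code; the step size, iteration count, and standard-Gaussian target ($f(x) = \|x\|^2/2$, so $\nabla f(x) = x$) are chosen here only for demonstration, and the small-variance Gaussian initialization mirrors the paper's initialization assumption.

```python
import numpy as np

def lmc_sample(grad_f, x0, step, n_steps, rng):
    """Run unadjusted Langevin Monte Carlo from x0.

    Each iteration performs a gradient step on the potential f plus
    Gaussian noise scaled by sqrt(2 * step).
    """
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x

# Illustrative target: nu = e^{-f} with f(x) = ||x||^2 / 2,
# i.e., a standard Gaussian in d dimensions, so grad_f(x) = x.
rng = np.random.default_rng(0)
d = 5
x0 = 0.01 * rng.standard_normal(d)  # small-variance Gaussian initialization

samples = np.array([
    lmc_sample(lambda x: x, x0, step=0.05, n_steps=500, rng=rng)
    for _ in range(2000)
])
print(samples.mean(), samples.var())  # should be near 0 and near 1
```

Note that because LMC is unadjusted (no Metropolis correction), its stationary distribution is biased away from the target by an amount controlled by the step size; the paper's rates quantify how small a step and how many iterations suffice to reach an $\epsilon$-neighborhood in chi-squared and Rényi divergence.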

This page was built for publication: Convergence of Langevin Monte Carlo in Chi-Squared and Rényi Divergence
