Geometrical Insights for Implicit Generative Modeling

From MaRDI portal
Publication:6162298

DOI: 10.1007/978-3-319-99492-5_11 · zbMATH Open: 1518.68290 · arXiv: 1712.07822 · OpenAlex: W2962750481 · MaRDI QID: Q6162298 · FDO: Q6162298

Maxime Oquab, Léon Bottou, David Lopez-Paz, Author name not available

Publication date: 28 June 2023

Published in: Braverman Readings in Machine Learning. Key Ideas from Inception to Current State

Abstract: Learning algorithms for implicit generative models can optimize a variety of criteria that measure how the data distribution differs from the implicit model distribution, including the Wasserstein distance, the Energy distance, and the Maximum Mean Discrepancy criterion. A careful look at the geometries induced by these distances on the space of probability measures reveals interesting differences. In particular, we can establish surprising approximate global convergence guarantees for the 1-Wasserstein distance, even when the parametric generator has a nonconvex parametrization.


Full work available at URL: https://arxiv.org/abs/1712.07822
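As a concrete illustration (not drawn from the paper itself), here is a minimal NumPy sketch of two of the criteria the abstract names: the biased squared Maximum Mean Discrepancy with a Gaussian kernel, and the Energy distance. The kernel bandwidth and sample shapes are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(a, b, bandwidth=1.0):
    # Gaussian kernel matrix between rows of a and rows of b
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def mmd2(x, y, bandwidth=1.0):
    # Biased estimator of squared MMD:
    # E[k(x,x')] + E[k(y,y')] - 2 E[k(x,y)]
    kxx = gaussian_kernel(x, x, bandwidth).mean()
    kyy = gaussian_kernel(y, y, bandwidth).mean()
    kxy = gaussian_kernel(x, y, bandwidth).mean()
    return kxx + kyy - 2.0 * kxy

def energy_distance(x, y):
    # Energy distance: 2 E||X-Y|| - E||X-X'|| - E||Y-Y'||
    dxy = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1).mean()
    dxx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1).mean()
    dyy = np.linalg.norm(y[:, None, :] - y[None, :, :], axis=-1).mean()
    return 2.0 * dxy - dxx - dyy

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, (200, 2))   # samples from the "data" distribution
y = rng.normal(0.0, 1.0, (200, 2))   # same distribution, different samples
z = rng.normal(3.0, 1.0, (200, 2))   # shifted distribution

# Both criteria should be larger between differing distributions
print(mmd2(x, y), mmd2(x, z))
print(energy_distance(x, y), energy_distance(x, z))
```

Both estimators approach zero as the two sample sets come from the same distribution and grow with distributional mismatch, which is the property the abstract's criteria share.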




Cited In (6)





