Last-Iterate Convergence of Optimistic Gradient Method for Monotone Variational Inequalities

Publication: Q6399398

arXiv: 2205.08446 · MaRDI QID: Q6399398 · FDO: Q6399398


Authors: Eduard Gorbunov, Adrien B. Taylor, Gauthier Gidel


Publication date: 17 May 2022

Abstract: The Past Extragradient (PEG) method [Popov, 1980], also known as the Optimistic Gradient method, has recently gained interest in the optimization community with the emergence of variational inequality formulations for machine learning. In the unconstrained case, Golowich et al. [2020] proved that an O(1/N) last-iterate convergence rate, in terms of the squared norm of the operator, can be achieved for Lipschitz and monotone operators with a Lipschitz Jacobian. In this work, by introducing a novel analysis through potential functions, we show that (i) this O(1/N) last-iterate convergence can be achieved without any assumption on the Jacobian of the operator, and (ii) it can be extended to the constrained case, which had not been derived before even under Lipschitzness of the Jacobian. The proof differs significantly from the one of Golowich et al. [2020], and its discovery was computer-aided. These results close the open question of the last-iterate convergence of PEG for monotone variational inequalities.
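
As an illustration of the method discussed in the abstract, below is a minimal sketch of the unconstrained Past Extragradient / Optimistic Gradient update x_{k+1} = x_k − γ(2F(x_k) − F(x_{k−1})), applied to a skew-symmetric bilinear operator (which is monotone and Lipschitz), with the squared operator norm ||F(x_N)||² tracked at the last iterate. The operator, step size, and iteration count are illustrative assumptions, not taken from the paper or its companion repository.

```python
import numpy as np

# Minimal sketch of the Past Extragradient / Optimistic Gradient update
# (unconstrained form): x_{k+1} = x_k - gamma * (2*F(x_k) - F(x_{k-1})).
# The operator F, step size, and iteration count below are illustrative
# assumptions, not values from the paper.

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = A - A.T  # skew-symmetric matrix => F(x) = A @ x is monotone and Lipschitz

def F(x):
    return A @ x

L = np.linalg.norm(A, 2)   # Lipschitz constant of F (spectral norm of A)
gamma = 1.0 / (3.0 * L)    # conservative step size

x_prev = rng.standard_normal(5)
x = x_prev.copy()
for _ in range(2000):
    x_next = x - gamma * (2.0 * F(x) - F(x_prev))
    x_prev, x = x, x_next

# Last-iterate progress measure used in the paper: squared operator norm.
print("||F(x_N)||^2 at the last iterate:", np.linalg.norm(F(x)) ** 2)
```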




Has companion code repository: https://github.com/eduardgorbunov/potentials_and_last_iter_convergence_for_vips









