Primal-dual accelerated gradient methods with small-dimensional relaxation oracle

From MaRDI portal
Publication:5085262

DOI: 10.1080/10556788.2020.1731747
zbMATH Open: 1489.90124
arXiv: 1809.05895
OpenAlex: W3009707357
MaRDI QID: Q5085262


Authors: Sergey Guminov, Pavel Dvurechensky, Yuri Nesterov, Alexander V. Gasnikov


Publication date: 27 June 2022

Published in: Optimization Methods & Software

Abstract: In this paper, a new variant of accelerated gradient descent is proposed. The proposed method does not require any information about the objective function, uses exact line search for the practical acceleration of convergence, converges according to the well-known lower bounds for both convex and non-convex objective functions, possesses primal-dual properties and can be applied in the non-Euclidean set-up. As far as we know, this is the first such method possessing all of the above properties at the same time. We also present a universal version of the method which is applicable to non-smooth problems. We demonstrate how in practice one can efficiently use the combination of line search and primal-duality by considering a convex optimization problem with a simple structure (for example, linearly constrained).
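To illustrate the combination of acceleration and exact line search mentioned in the abstract, here is a minimal sketch of a Nesterov-style accelerated gradient loop where the gradient step size is chosen by exact line search on a convex quadratic f(x) = ½xᵀAx − bᵀx. All names (`A`, `b`, `agd_line_search`) are illustrative, and this is not the authors' algorithm: the paper's method performs a small-dimensional relaxation rather than a one-dimensional line search.

```python
import numpy as np

def agd_line_search(A, b, x0, iters=200):
    """Accelerated gradient sketch for f(x) = 0.5*x'Ax - b'x (A SPD).

    Uses exact line search along the gradient instead of a fixed 1/L
    step; the extrapolation coefficient k/(k+3) is the standard
    convex-case momentum schedule.
    """
    x = y = x0.copy()
    for k in range(iters):
        g = A @ y - b                    # gradient of f at the point y
        denom = g @ (A @ g)
        # Exact line search: argmin_t f(y - t*g) has a closed form
        # for a quadratic; guard against g = 0 at the optimum.
        t = (g @ g) / denom if denom > 0 else 0.0
        x_new = y - t * g                # gradient step from y
        y = x_new + k / (k + 3) * (x_new - x)  # momentum extrapolation
        x = x_new
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M.T @ M + np.eye(5)                  # symmetric positive definite
b = rng.standard_normal(5)
x_star = np.linalg.solve(A, b)           # exact minimizer for reference
x_hat = agd_line_search(A, b, np.zeros(5))
print(float(np.linalg.norm(x_hat - x_star)))
```

The closed-form step t = (gᵀg)/(gᵀAg) is what "exact line search" means for a quadratic; for general smooth objectives one would substitute a one-dimensional solver, and the paper's method replaces this with a relaxation over a small-dimensional subspace.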


Full work available at URL: https://arxiv.org/abs/1809.05895








Cites Work


Cited In (15)






