Universal intermediate gradient method for convex problems with inexact oracle

From MaRDI portal
Publication:5865342

DOI: 10.1080/10556788.2019.1711079 · zbMATH Open: 1493.90132 · arXiv: 1712.06036 · OpenAlex: W2998975330 · Wikidata: Q126332543 (Scholia: Q126332543) · MaRDI QID: Q5865342 · FDO: Q5865342


Authors: Dmitry Kamzolov, Pavel Dvurechensky, Alexander V. Gasnikov


Publication date: 13 June 2022

Published in: Optimization Methods & Software

Abstract: In this paper, we propose new first-order methods for the minimization of a convex function over a simple convex set. We assume that the objective is a composite function given as the sum of a simple convex function and a convex function with an inexact Hölder-continuous subgradient. We propose the Universal Intermediate Gradient Method, which enjoys both the universality and intermediateness properties. Following the paper by Y. Nesterov (Math. Prog., 2015) on Universal Gradient Methods, our method does not require any information about the Hölder parameter and constant and adjusts itself automatically to the local level of smoothness. On the other hand, in the spirit of the preprint by O. Devolder, F. Glineur, and Y. Nesterov (CORE DP 2013/17), our method is intermediate in the sense that it interpolates between the Universal Gradient Method and the Universal Fast Gradient Method. This makes it possible to balance the rate of convergence of the method against the rate of oracle error accumulation. Under the additional assumption of strong convexity of the objective, we show how the restart technique can be used to obtain an algorithm with a faster rate of convergence.
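The universal adaptation the abstract refers to can be illustrated with a minimal sketch of a Nesterov-style universal gradient step: the method backtracks on a smoothness estimate L using an inexact-descent test with slack eps/2, so no Hölder constant is supplied in advance. This is a simplified illustration under assumed names and a toy test problem, not the authors' intermediate method:

```python
import numpy as np

def universal_gradient_method(f, grad, x0, eps, L0=1.0, max_iter=200):
    """Sketch of a universal (primal) gradient method.

    Adapts to unknown Hölder smoothness by doubling the estimate L
    until an inexact descent condition with eps/2 slack holds.
    """
    x = np.asarray(x0, dtype=float)
    L = L0
    for _ in range(max_iter):
        g = grad(x)
        while True:
            x_new = x - g / L  # gradient step with current estimate L
            # Universal test: quadratic upper model plus eps/2 slack
            model = f(x) + g @ (x_new - x) + 0.5 * L * np.sum((x_new - x) ** 2)
            if f(x_new) <= model + eps / 2:
                break
            L *= 2.0           # estimate too optimistic: double it
        x = x_new
        L /= 2.0               # try a larger step at the next iteration
    return x
```

On a smooth quadratic such as f(x) = ||x||², the backtracking test accepts a step as soon as L reaches the true Lipschitz constant, and the iterates converge to the minimizer without that constant ever being given to the method.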


Full work available at URL: https://arxiv.org/abs/1712.06036






Cited In (9)






