Mirror descent and convex optimization problems with non-smooth inequality constraints

From MaRDI portal
Publication: Q2415205

DOI: 10.1007/978-3-319-97478-1_8
zbMATH Open: 1421.90112
arXiv: 1710.06612
OpenAlex: W2767083550
MaRDI QID: Q2415205


Authors: Anastasia Bayandina, Pavel Dvurechensky, Alexander V. Gasnikov, F. S. Stonyakin, Alexander A. Titov


Publication date: 21 May 2019

Abstract: We consider the problem of minimizing a convex function over a simple set subject to a convex non-smooth inequality constraint, and describe first-order methods for such problems in different situations: smooth or non-smooth objective function; convex or strongly convex objective and constraint; deterministic or randomized information about the objective and constraint. We hope it is convenient for the reader to have the methods for these different settings collected in one place. The described methods are based on the Mirror Descent algorithm and the switching subgradient scheme. One of our focuses is to propose, for each of the listed settings, a Mirror Descent method with adaptive stepsizes and an adaptive stopping rule. This means that neither the stepsize nor the stopping rule requires knowledge of the Lipschitz constant of the objective or the constraint. We also construct Mirror Descent for problems whose objective function is not Lipschitz continuous, e.g. a quadratic function. Besides that, we address the problem of recovering a solution of the dual problem.
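The switching subgradient scheme described in the abstract can be illustrated by a minimal sketch in the Euclidean proximal setup (where Mirror Descent reduces to subgradient descent). This is not the paper's exact algorithm — the adaptive stepsize rule `eps / ||grad||^2` and the weighted averaging of productive iterates follow the general pattern of such schemes, and the test problem below is purely illustrative:

```python
import numpy as np

def switching_mirror_descent(f_grad, g, g_grad, x0, eps, max_iter=100000):
    """Sketch of adaptive Mirror Descent with a switching subgradient
    scheme (Euclidean prox setup).  When the iterate is eps-feasible
    (g(x) <= eps), a "productive" step uses the objective subgradient;
    otherwise a "non-productive" step reduces infeasibility.  Stepsizes
    eps / ||grad||^2 are adaptive: no Lipschitz constants are needed."""
    x = np.asarray(x0, dtype=float)
    prod_sum, prod_weight = np.zeros_like(x), 0.0
    for _ in range(max_iter):
        if g(x) <= eps:                       # productive step
            grad = f_grad(x)
            h = eps / np.dot(grad, grad)
            prod_sum += h * x                 # accumulate weighted iterate
            prod_weight += h
        else:                                 # non-productive step
            grad = g_grad(x)
            h = eps / np.dot(grad, grad)
        x = x - h * grad
    if prod_weight == 0.0:
        raise RuntimeError("no productive steps; increase max_iter or eps")
    return prod_sum / prod_weight             # weighted average of productive iterates

# Illustrative problem (not from the paper):
#   minimize x1 + x2  subject to  ||x||^2 - 1 <= 0,
# whose optimum is (-1/sqrt(2), -1/sqrt(2)).
f_grad = lambda x: np.array([1.0, 1.0])
g = lambda x: np.dot(x, x) - 1.0
g_grad = lambda x: 2.0 * x
x_hat = switching_mirror_descent(f_grad, g, g_grad, x0=[0.0, 0.0], eps=1e-2)
print(x_hat)
```

The returned point is an approximate solution in the usual sense for such schemes: its objective gap and constraint violation are both controlled by `eps`.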


Full work available at URL: https://arxiv.org/abs/1710.06612



Cited In (25)



