Faster subgradient methods for functions with Hölderian growth


DOI: 10.1007/s10107-018-01361-0
zbMATH Open: 1461.65161
arXiv: 1704.00196
OpenAlex: W3104046271
MaRDI QID: Q2297653


Authors: Patrick R. Johnstone, Pierre Moulin


Publication date: 20 February 2020

Published in: Mathematical Programming. Series A. Series B

Abstract: The purpose of this manuscript is to derive new convergence results for several subgradient methods applied to minimizing nonsmooth convex functions with Hölderian growth. The growth condition is satisfied in many applications and includes functions with quadratic growth and weakly sharp minima as special cases. To this end there are three main contributions. First, for a constant and sufficiently small stepsize, we show that the subgradient method achieves linear convergence up to a certain region including the optimal set, with error on the order of the stepsize. Second, if appropriate problem parameters are known, we derive a decaying stepsize which obtains a much faster convergence rate than is suggested by the classical O(1/√k) result for the subgradient method. Third, we develop a novel "descending stairs" stepsize which obtains this faster convergence rate and also obtains linear convergence for the special case of weakly sharp functions. We also develop an adaptive variant of the "descending stairs" stepsize which achieves the same convergence rate without requiring an error-bound constant that is difficult to estimate in practice.
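For reference, the Hölderian growth condition named in the abstract admits a standard statement; the symbols below (growth constant μ, exponent p, solution set X*) follow common usage rather than necessarily the paper's exact notation:

\[
f(x) - f^{*} \;\ge\; \mu \, \operatorname{dist}(x, X^{*})^{p} \qquad \text{for all } x,
\]

where f^{*} is the optimal value of f. Weakly sharp minima correspond to p = 1 and quadratic growth to p = 2, matching the special cases mentioned in the abstract.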
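The "descending stairs" stepsize can be illustrated generically: run the plain subgradient method with a constant stepsize for a stage of iterations, then shrink the stepsize and start the next stage. The Python sketch below is a minimal illustration under assumed choices (the halving factor, the fixed stage length, and the function names are all hypothetical); in the paper the stage lengths and decrease factor are derived from the growth parameters.

import numpy as np

def descending_stairs_subgradient(subgrad, x0, step0, n_stages=8, stage_len=100):
    # Illustrative sketch (assumed schedule, not the paper's exact one):
    # hold the stepsize constant for one stage, then halve it (one "stair" down).
    x = np.asarray(x0, dtype=float)
    step = step0
    for _ in range(n_stages):
        for _ in range(stage_len):
            x = x - step * subgrad(x)   # plain subgradient step
        step *= 0.5                     # descend one stair
    return x

# Example: f(x) = ||x||_1 is weakly sharp (p = 1) with minimizer x* = 0,
# and np.sign(x) is a valid subgradient of the l1 norm.
x = descending_stairs_subgradient(np.sign, x0=np.ones(5), step0=1.0)
print(x)   # should land at (or very near) the minimizer 0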


Full work available at URL: https://arxiv.org/abs/1704.00196






Cited in: 14 documents





