Stochastic Gradient-Push for Strongly Convex Functions on Time-Varying Directed Graphs

From MaRDI portal
Publication:2979341

DOI: 10.1109/TAC.2016.2529285
zbMATH Open: 1359.90142
arXiv: 1406.2075
OpenAlex: W2963156126
MaRDI QID: Q2979341
FDO: Q2979341


Authors: Angelia Nedić, Alex Olshevsky


Publication date: 3 May 2017

Published in: IEEE Transactions on Automatic Control

Abstract: We investigate the convergence rate of the recently proposed subgradient-push method for distributed optimization over time-varying directed graphs. The subgradient-push method can be implemented in a distributed way without requiring knowledge of either the number of agents or the graph sequence; each node is only required to know its out-degree at each time. Our main result is a convergence rate of $O((\ln t)/t)$ for strongly convex functions with Lipschitz gradients even if only stochastic gradient samples are available; this is asymptotically faster than the $O((\ln t)/\sqrt{t})$ rate previously known for (general) convex functions.
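The abstract describes the (sub)gradient-push iteration: each node pushes its state and a push-sum weight to its out-neighbors, scaled by its own out-degree, de-biases by the weight, and takes a stochastic gradient step with an $O(1/t)$ step size. The following is a minimal sketch of that scheme on a hypothetical test problem (quadratic local objectives $f_i(z) = \tfrac{1}{2}(z - a_i)^2$ over a small alternating directed ring); the graph sequence, noise level, and step size here are illustrative choices, not the paper's exact experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test problem: node i holds f_i(z) = 0.5*(z - a[i])**2,
# so the minimizer of sum_i f_i is the average of a.
n = 5
a = np.arange(n, dtype=float)            # optimum is z* = a.mean() = 2.0

def stochastic_grad(i, z):
    # gradient of f_i at z plus zero-mean noise (a stochastic sample)
    return (z - a[i]) + 0.01 * rng.standard_normal()

x = np.zeros(n)                          # gradient-push states
y = np.ones(n)                           # push-sum weights
T = 2000
for t in range(1, T + 1):
    # time-varying directed graph: alternate between two rings; each
    # node knows only its own out-degree, as the method requires
    shift = 1 if t % 2 == 0 else 2
    out = [[i, (i + shift) % n] for i in range(n)]   # out-neighbors incl. self
    deg = np.array([len(o) for o in out], dtype=float)

    w = np.zeros(n)                      # received state mass
    v = np.zeros(n)                      # received weight mass
    for i in range(n):
        for j in out[i]:                 # node i pushes x_i/d_i and y_i/d_i
            w[j] += x[i] / deg[i]
            v[j] += y[i] / deg[i]

    z = w / v                            # de-biased estimates
    alpha = 1.0 / t                      # O(1/t) step size (strong convexity)
    x = w - alpha * np.array([stochastic_grad(i, z[i]) for i in range(n)])
    y = v

print(z)  # entries should be near a.mean() = 2.0
```

With this step-size choice the de-biased estimates `z` of all nodes approach the network-wide minimizer, consistent with the $O((\ln t)/t)$ rate the paper proves for strongly convex objectives.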


Full work available at URL: https://arxiv.org/abs/1406.2075







Cited In (35)





