Perturbed iterate analysis for asynchronous stochastic optimization

From MaRDI portal
Publication: Q4588862

DOI: 10.1137/16M1057000 · zbMATH Open: 1376.65096 · arXiv: 1507.06970 · OpenAlex: W2962952793 · MaRDI QID: Q4588862


Authors: Horia Mania, Xinghao Pan, Dimitris S. Papailiopoulos, Benjamin Recht, Kannan Ramchandran, Michael Jordan


Publication date: 3 November 2017

Published in: SIAM Journal on Optimization

Abstract: We introduce and analyze stochastic optimization methods where the input to each gradient update is perturbed by bounded noise. We show that this framework forms the basis of a unified approach to analyzing asynchronous implementations of stochastic optimization algorithms. In this framework, asynchronous stochastic optimization algorithms can be thought of as serial methods operating on noisy inputs. Using our perturbed iterate framework, we provide new analyses of the Hogwild! algorithm and asynchronous stochastic coordinate descent that are simpler than earlier analyses, remove many assumptions of previous models, and in some cases yield improved upper bounds on the convergence rates. We proceed to apply our framework to develop and analyze KroMagnon: a novel, parallel, sparse stochastic variance-reduced gradient (SVRG) algorithm. We demonstrate experimentally on a 16-core machine that the sparse and parallel version of SVRG is in some cases more than four orders of magnitude faster than the standard SVRG algorithm.
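The core idea described in the abstract can be illustrated with a minimal sketch: a serial SGD loop in which each gradient is evaluated not at the current iterate but at a perturbed copy of it, with the perturbation bounded in norm. This is how the paper models asynchrony (stale reads of shared memory) as bounded input noise. The function names, the noise model, and all constants below are illustrative, not taken from the paper's implementation.

```python
import numpy as np

def perturbed_sgd(grad, x0, step, n_steps, noise_bound, rng=None):
    """Serial SGD where each gradient is computed at a perturbed iterate
    x_hat = x + noise, with ||noise|| <= noise_bound. This mimics an
    asynchronous worker reading a stale/inconsistent copy of x."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_steps):
        # draw a random direction and scale it to lie within the noise ball
        noise = rng.standard_normal(x.shape)
        norm = np.linalg.norm(noise)
        if norm > 0:
            noise *= noise_bound * rng.random() / norm
        x_hat = x + noise            # the "perturbed iterate"
        x -= step * grad(x_hat)      # update uses the gradient at x_hat, not x
    return x

# Toy example: minimize f(x) = ||x||^2 / 2, whose gradient is grad f(x) = x.
x_final = perturbed_sgd(lambda x: x, x0=np.ones(5), step=0.1,
                        n_steps=500, noise_bound=0.01)
```

With bounded perturbations, the iterates contract toward the minimizer up to a noise floor proportional to the perturbation bound, which is the flavor of guarantee the perturbed iterate analysis makes precise for Hogwild!-style algorithms.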


Full work available at URL: https://arxiv.org/abs/1507.06970




Cited In (15)


