Perturbed iterate analysis for asynchronous stochastic optimization
DOI: 10.1137/16M1057000 · zbMATH Open: 1376.65096 · arXiv: 1507.06970 · OpenAlex: W2962952793 · MaRDI QID: Q4588862 · FDO: Q4588862
Authors: Horia Mania, Xinghao Pan, Dimitris S. Papailiopoulos, Benjamin Recht, Kannan Ramchandran, Michael Jordan
Publication date: 3 November 2017
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1507.06970
Recommendations
- Improved asynchronous parallel optimization analysis for stochastic incremental methods
- Asynchronous stochastic coordinate descent: parallelism and convergence properties
- Distributed asynchronous deterministic and stochastic gradient optimization algorithms
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup
Keywords: convergence; numerical examples; stochastic optimization; asynchronous algorithms; parallel machine learning; sparse stochastic variance-reduced gradient algorithm
MSC classification: Numerical mathematical programming methods (65K05); Learning and adaptive systems in artificial intelligence (68T05); Parallel numerical computation (65Y05); Stochastic programming (90C15)
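The keywords above center on asynchronous stochastic optimization analyzed through perturbed iterates. As an illustrative sketch only (not taken from the paper or its code), the perturbed iterate view models each update as a stochastic gradient step taken at a stale, inconsistently-read iterate x̂ₜ rather than the current iterate xₜ. The toy simulation below, with a hypothetical fixed delay `tau` on a noise-free least-squares problem, shows the idea:

```python
# Illustrative sketch of the perturbed-iterate view of asynchronous SGD
# (an assumption-laden toy, not the paper's algorithm or code):
#   x_{t+1} = x_t - gamma * grad f_i(x_hat_t),
# where x_hat_t is a stale iterate read with a simulated delay tau.
import numpy as np

def async_sgd_simulated(A, b, gamma=0.01, tau=3, steps=500, seed=0):
    """Simulate delayed-read SGD on the least-squares loss (1/2)||Ax - b||^2."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    history = [x.copy()]          # past iterates, so we can read stale ones
    for _ in range(steps):
        i = rng.integers(n)       # sample one row (one stochastic gradient)
        # "perturbed" iterate: read a copy that is tau steps out of date
        x_hat = history[max(0, len(history) - 1 - tau)]
        g = (A[i] @ x_hat - b[i]) * A[i]   # stochastic gradient at x_hat
        x = x - gamma * g
        history.append(x.copy())
    return x

# Toy problem with an exact solution, so the noise vanishes at the optimum.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 5))
x_star = rng.standard_normal(5)
b = A @ x_star
x_hat = async_sgd_simulated(A, b)
```

With a small step size the delayed reads act as a bounded perturbation, and the residual ‖Ax − b‖ still shrinks; this separation of "what was read" from "what was written" is the core device the perturbed iterate analysis makes rigorous.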
Cites Work
- ARock: an algorithmic framework for asynchronous parallel coordinate updates
- Title not available
- Parallel coordinate descent methods for big data optimization
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Beyond the regret minimization barrier: optimal algorithms for stochastic strongly-convex optimization
- Distributed asynchronous deterministic and stochastic gradient optimization algorithms
- Convex optimization: algorithms and complexity
- Chaotic relaxation
- Asynchronous stochastic coordinate descent: parallelism and convergence properties
- Revisiting Asynchronous Linear Solvers
- An Asynchronous Mini-Batch Algorithm for Regularized Stochastic Optimization
Cited In (15)
- On the convergence analysis of aggregated heavy-ball method
- Title not available
- Parallel stochastic asynchronous coordinate descent: tight bounds on the possible parallelism
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup
- Parallel and distributed asynchronous adaptive stochastic gradient methods
- Asynchronous parallel algorithms for nonconvex optimization
- Nonlinear Gradient Mappings and Stochastic Optimization: A General Framework with Applications to Heavy-Tail Noise
- A robust multi-batch L-BFGS method for machine learning
- Title not available
- Applications of Fokker Planck equations in machine learning algorithms
- Incremental without replacement sampling in nonconvex optimization
- Improved asynchronous parallel optimization analysis for stochastic incremental methods
- Distributed stochastic optimization with large delays
- On the convergence analysis of asynchronous SGD for solving consistent linear systems
- Delay and cooperation in nonstochastic bandits