Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup
Publication: 2235160
DOI: 10.1007/s10107-020-01552-8
zbMath: 1478.90056
arXiv: 1811.03254
OpenAlex: W3084085807
MaRDI QID: Q2235160
Yixin Tao, Yun Kuen Cheung, Richard John Cole
Publication date: 20 October 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1811.03254
Cites Work
- On the complexity analysis of randomized block-coordinate descent methods
- A coordinate gradient descent method for nonsmooth separable minimization
- Introductory lectures on convex optimization. A basic course.
- On asynchronous iterations
- Support-vector networks
- Coordinate descent algorithms
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Chaotic relaxation
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Revisiting Asynchronous Linear Solvers
- The Group Lasso for Logistic Regression
- Distributed asynchronous deterministic and stochastic gradient optimization algorithms
- Asynchronous Iterative Methods for Multiprocessors
- Asynchronous Stochastic Approximations
- Gradient Convergence in Gradient methods with Errors
- Perturbed Iterate Analysis for Asynchronous Stochastic Optimization
- Improved asynchronous parallel optimization analysis for stochastic incremental methods
- Amortized Analysis of Asynchronous Price Dynamics
- Parallel Stochastic Asynchronous Coordinate Descent: Tight Bounds on the Possible Parallelism
- Tatonnement beyond gross substitutes?
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm