An Asynchronous Parallel Stochastic Coordinate Descent Algorithm

Publication: 5502117


zbMath: 1337.68286
arXiv: 1311.1873
MaRDI QID: Q5502117

Victor Bittorf, Stephen J. Wright, Srikrishna Sridhar, Christopher Ré, Ji Liu

Publication date: 17 August 2015

Full work available at URL: https://arxiv.org/abs/1311.1873


90C25: Convex programming

68W10: Parallel algorithms in computer science
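
For orientation only: the algorithm studied in this publication has several threads repeatedly pick a random coordinate, compute that coordinate's partial gradient from a possibly stale read of the shared iterate, and write the update back without a global lock. The following is a minimal illustrative sketch in Python of that lock-free update pattern on an assumed strongly convex quadratic; the problem data, step-size factor, and thread count are assumptions for illustration, not the paper's exact method or experiments.

```python
import threading
import numpy as np

# Assumed test problem: a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x.
n = 200
rng = np.random.default_rng(0)
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite
b = rng.standard_normal(n)

x = np.zeros(n)                    # shared iterate, updated in place by all threads
diag_L = np.diag(A).copy()         # per-coordinate Lipschitz constants
gamma = 0.5                        # step-size damping factor (illustrative choice)

def worker(seed: int, num_updates: int) -> None:
    local_rng = np.random.default_rng(seed)   # per-thread RNG; generators are not thread-safe
    for _ in range(num_updates):
        i = local_rng.integers(n)
        g_i = A[i] @ x - b[i]                 # gradient component from a possibly stale read of x
        x[i] -= gamma * g_i / diag_L[i]       # lock-free single-coordinate update

threads = [threading.Thread(target=worker, args=(s, 20000)) for s in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

x_star = np.linalg.solve(A, b)
print("distance to minimizer:", np.linalg.norm(x - x_star))
```

In CPython the global interpreter lock means these threads interleave rather than run truly in parallel, but the unsynchronized coordinate writes and the staleness between reads and updates are the features that this line of work analyzes.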


Related Items

Unnamed Item
Perturbed Iterate Analysis for Asynchronous Stochastic Optimization
Decentralized Consensus Algorithm with Delayed and Stochastic Gradients
Optimization Methods for Large-Scale Machine Learning
Improved asynchronous parallel optimization analysis for stochastic incremental methods
Unnamed Item
Unnamed Item
Distributed Learning with Sparse Communications by Identification
A class of smooth exact penalty function methods for optimization problems with orthogonality constraints
An asynchronous inertial algorithm for solving convex feasibility problems with strict pseudo-contractions in Hilbert spaces
On the rates of convergence of parallelized averaged stochastic gradient algorithms
Parallel Stochastic Asynchronous Coordinate Descent: Tight Bounds on the Possible Parallelism
Two Symmetrized Coordinate Descent Methods Can Be $O(n^2)$ Times Slower Than the Randomized Version
Parallelizable Algorithms for Optimization Problems with Orthogonality Constraints
CoordinateWise Descent Methods for Leading Eigenvalue Problem
Accelerate stochastic subgradient method by leveraging local growth condition
Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
Redundancy Techniques for Straggler Mitigation in Distributed Optimization and Learning
A generic coordinate descent solver for non-smooth convex optimisation
Cyclic Coordinate Dual Averaging with Extrapolation
Parallel and distributed asynchronous adaptive stochastic gradient methods
On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization
Convergence of an asynchronous block-coordinate forward-backward algorithm for convex composite optimization
On unbounded delays in asynchronous parallel fixed-point algorithms
Stochastic block-coordinate gradient projection algorithms for submodular maximization
Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs
Parallel decomposition methods for linearly constrained problems subject to simple bound with application to the SVMs training
A distributed quantile estimation algorithm of heavy-tailed distribution with massive datasets
Resolving learning rates adaptively by locating stochastic non-negative associated gradient projection points using line searches
Asynchronous parallel algorithms for nonconvex optimization
Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup
Asynchronous Lagrangian scenario decomposition
Coordinate descent algorithms
Linear convergence of first order methods for non-strongly convex optimization
A framework for parallel second order incremental optimization algorithms for solving partially separable problems
On the convergence of asynchronous parallel iteration with unbounded delays
A linearly convergent doubly stochastic Gauss-Seidel algorithm for solving linear equations and a certain class of over-parameterized optimization problems
Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry
Variance reduction for root-finding problems
On the parallelization upper bound for asynchronous stochastic gradients descent in non-convex optimization
ARock: An Algorithmic Framework for Asynchronous Parallel Coordinate Updates
Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
Accelerated, Parallel, and Proximal Coordinate Descent
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
Distributed Block Coordinate Descent for Minimizing Partially Separable Functions


Uses Software