On the convergence of asynchronous parallel iteration with unbounded delays
Publication: Q2422607 (MaRDI QID)
DOI: 10.1007/s40305-017-0183-1
zbMath: 1424.65086
arXiv: 1612.04425
OpenAlex: W2963337970
Zhimin Peng, Ming Yan, Wotao Yin, Yang-yang Xu
Publication date: 20 June 2019
Published in: Journal of the Operations Research Society of China
Full work available at URL: https://arxiv.org/abs/1612.04425
MSC classification:
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Nonconvex programming, global optimization (90C26)
- Parallel numerical computation (65Y05)
Related Items
- Parallel and distributed asynchronous adaptive stochastic gradient methods
- Asynchronous parallel algorithms for nonconvex optimization
- Markov chain block coordinate descent
- Distributed Stochastic Inertial-Accelerated Methods with Delayed Derivatives for Nonconvex Problems
Cites Work
- Unnamed Item
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- On the complexity analysis of randomized block-coordinate descent methods
- Iteration complexity analysis of block coordinate descent methods
- A coordinate gradient descent method for nonsmooth separable minimization
- On the convergence of the coordinate descent method for convex differentiable minimization
- On asynchronous iterations
- A probabilistic analysis of asynchronous iteration
- On unbounded delays in asynchronous parallel fixed-point algorithms
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- Support-vector networks
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- Asynchronous parallel algorithms for nonconvex optimization
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Chaotic relaxation
- ARock: An Algorithmic Framework for Asynchronous Parallel Coordinate Updates
- Augmented $\ell_1$ and Nuclear-Norm Models with a Globally Linearly Convergent Algorithm
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
- Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization
- Robust Stochastic Approximation Approach to Stochastic Programming
- Distributed asynchronous computation of fixed points
- Variational Analysis
- Partially Asynchronous, Parallel Algorithms for Network Flow and Other Problems
- Tensor Regression with Applications in Neuroimaging Data Analysis
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- A case study in programming for parallel-processors
- Probability: A Graduate Course
- Convergence of a block coordinate descent method for nondifferentiable minimization