Parallel and distributed asynchronous adaptive stochastic gradient methods
Publication: 6095736
DOI: 10.1007/s12532-023-00237-5 · zbMath: 1519.90142 · arXiv: 2002.09095 · OpenAlex: W4360980105 · MaRDI QID: Q6095736
Jie Chen, Colin Sutcher-Shepard, Yonggui Yan, Yang-yang Xu, Yibo Xu, Leopold Grinberg
Publication date: 8 September 2023
Published in: Mathematical Programming Computation
Full work available at URL: https://arxiv.org/abs/2002.09095
Classification (MSC):
- Artificial neural networks and deep learning (68T07)
- Numerical mathematical programming methods (65K05)
- Stochastic programming (90C15)
- Parallel numerical computation (65Y05)
Cites Work
- Some aspects of parallel and distributed iterative algorithms - a survey
- On the convergence of asynchronous parallel iteration with unbounded delays
- ARock: An Algorithmic Framework for Asynchronous Parallel Coordinate Updates
- An Asynchronous Mini-Batch Algorithm for Regularized Stochastic Optimization
- Robust Stochastic Approximation Approach to Stochastic Programming
- Acceleration of Stochastic Approximation by Averaging
- Perturbed Iterate Analysis for Asynchronous Stochastic Optimization
- Improved asynchronous parallel optimization analysis for stochastic incremental methods
- Distributed Learning Systems with First-Order Methods
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- A Stochastic Approximation Method
- Asynchronous Gradient Push