Linear convergence of cyclic SAGA
Publication: 2193004
DOI: 10.1007/s11590-019-01520-y
zbMATH: 1450.90030
arXiv: 1810.11167
OpenAlex: W2996955267
Wikidata: Q126396270 (Scholia: Q126396270)
MaRDI QID: Q2193004
Publication date: 24 August 2020
Published in: Optimization Letters
Full work available at URL: https://arxiv.org/abs/1810.11167
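For context on the method this entry concerns: SAGA minimizes a finite sum (1/n) Σᵢ fᵢ(x) by keeping a table of previously evaluated component gradients and using it to variance-reduce each update; the cyclic variant studied here visits the components in a fixed order instead of sampling them uniformly at random. The sketch below is a minimal illustration of that standard SAGA update with cyclic index selection, not the paper's exact algorithm; the function name `cyclic_saga`, its signature, and the constant step size are illustrative assumptions.

```python
import numpy as np

def cyclic_saga(grad_fns, x0, step_size, n_epochs):
    """Minimize (1/n) * sum_i f_i(x) with SAGA updates, visiting the
    component indices in cyclic order i = 0, 1, ..., n-1, 0, 1, ...

    grad_fns  -- list of n callables, grad_fns[i](x) = grad f_i(x)
    x0        -- initial point (1-D numpy array)
    step_size -- constant step size (illustrative choice)
    n_epochs  -- number of full passes over the n components
    """
    n = len(grad_fns)
    x = np.asarray(x0, dtype=float).copy()
    # Table of stored component gradients alpha_i, initialized at x0,
    # plus its running average, maintained in O(d) per step.
    table = np.array([g(x) for g in grad_fns])
    table_avg = table.mean(axis=0)

    for _ in range(n_epochs):
        for i in range(n):  # cyclic, not uniform-random, selection
            g_new = grad_fns[i](x)
            # SAGA gradient estimate: fresh component gradient, minus
            # the stored one, plus the average of the whole table.
            v = g_new - table[i] + table_avg
            x -= step_size * v
            # Refresh the table entry and its average.
            table_avg += (g_new - table[i]) / n
            table[i] = g_new
    return x

# Usage sketch (hypothetical data): least squares with n = 50 terms,
# f_i(x) = 0.5 * (a_i^T x - b_i)^2, in R^10.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 10)), rng.standard_normal(50)
grads = [lambda x, a=A[i], bi=b[i]: (a @ x - bi) * a for i in range(50)]
x_hat = cyclic_saga(grads, np.zeros(10), step_size=1e-2, n_epochs=200)
```

The per-step cost matches random-sampling SAGA; only the index rule differs, which is the modification whose linear convergence rate the paper analyzes.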
Cites Work
- Minimizing finite sums with the stochastic average gradient
- Incremental proximal methods for large scale convex optimization
- Introductory lectures on convex optimization. A basic course.
- An optimal randomized incremental gradient method
- Incrementally updated gradient methods for constrained and regularized optimization
- Incremental constraint projection methods for variational inequalities
- Variance-Reduced Stochastic Learning by Networked Agents Under Random Reshuffling
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
- Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- A Convergent Incremental Gradient Method with a Constant Step Size
- A Stochastic Approximation Method
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization