Finite-sum smooth optimization with SARAH
DOI: 10.1007/s10589-022-00375-x · zbMath: 1494.90087 · arXiv: 1901.07648 · OpenAlex: W2942057781 · MaRDI QID: Q2149950
Tsui-Wei Weng, Jayant R. Kalagnanam, Marten van Dijk, Dzung T. Phan, Lam M. Nguyen, Phuong Ha Nguyen
Publication date: 27 June 2022
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1901.07648
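For orientation, the paper's subject is the SARAH estimator for finite-sum smooth problems min_w (1/n) sum_i f_i(w): each outer loop starts from a full gradient, and inner iterations update it recursively as v_t = grad f_{i_t}(w_t) - grad f_{i_t}(w_{t-1}) + v_{t-1}. Below is a minimal Python sketch of that recursion; the function names, default parameters, and uniform sampling scheme are illustrative assumptions, not the paper's reference implementation.

    import numpy as np

    def sarah(grad_i, w0, n, step=0.01, inner_loops=100, outer_loops=10, seed=0):
        # grad_i(w, i): gradient of the i-th summand f_i at w; n: number of summands.
        rng = np.random.default_rng(seed)
        w = np.asarray(w0, dtype=float).copy()
        for _ in range(outer_loops):
            # v_0: full gradient, recomputed at the start of each outer loop.
            v = np.mean([grad_i(w, i) for i in range(n)], axis=0)
            w_prev, w = w, w - step * v
            for _ in range(inner_loops):
                i = rng.integers(n)
                # Recursive update: v_t = grad f_i(w_t) - grad f_i(w_{t-1}) + v_{t-1}.
                v = grad_i(w, i) - grad_i(w_prev, i) + v
                w_prev, w = w, w - step * v
        return w

For a least-squares instance, grad_i(w, i) would return a_i * (a_i @ w - b_i) for the i-th data row; unlike SVRG, the estimator is biased but its variance contracts along the inner loop.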
Related Items
- A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization
- DESTRESS: Computation-Optimal and Communication-Efficient Decentralized Nonconvex Finite-Sum Optimization
- A Convergence Study of SGD-Type Methods for Stochastic Optimization
Cites Work
- Minimizing finite sums with the stochastic average gradient
- Introductory lectures on convex optimization. A basic course.
- Optimization Methods for Large-Scale Machine Learning
- New Convergence Aspects of Stochastic Gradient Algorithms
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- A Stochastic Approximation Method
- Inexact SARAH algorithm for stochastic optimization