Accelerating variance-reduced stochastic gradient methods
DOI: 10.1007/s10107-020-01566-2 · zbMath: 1489.90113 · arXiv: 1910.09494 · OpenAlex: W3084718985 · MaRDI QID: Q2118092
Derek Driggs, Matthias J. Ehrhardt, Carola-Bibiane Schönlieb
Publication date: 22 March 2022
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1910.09494
MSC classification:
- Analysis of algorithms and problem complexity (68Q25)
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Abstract computational complexity for mathematical programming problems (90C60)
- Stochastic programming (90C15)
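For context, below is a minimal sketch of the kind of gradient estimator this line of work accelerates: an SVRG-style variance-reduced step, applied here to a least-squares finite sum. This is an illustrative assumption on our part, not the accelerated scheme proposed in the publication; the function name, step size, and epoch count are all hypothetical choices.

```python
# Illustrative SVRG-style variance-reduced gradient descent (a sketch,
# not the paper's accelerated method). Minimizes the finite sum
#   f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2.
import numpy as np

def svrg(A, b, step=0.02, epochs=30, seed=0):
    """Each epoch anchors a full gradient mu at x_ref; inner steps use the
    variance-reduced estimator g_i(x) - g_i(x_ref) + mu, whose variance
    shrinks as x and x_ref approach the minimizer."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        x_ref = x.copy()
        mu = A.T @ (A @ x_ref - b) / n              # full gradient at anchor
        for _ in range(n):
            i = rng.integers(n)
            g_i = A[i] * (A[i] @ x - b[i])          # sample gradient at x
            g_ref = A[i] * (A[i] @ x_ref - b[i])    # same sample at anchor
            x -= step * (g_i - g_ref + mu)          # variance-reduced step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 5))
    x_true = rng.standard_normal(5)
    b = A @ x_true
    print("error:", np.linalg.norm(svrg(A, b) - x_true))
```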
Related Items (3):
- An aggressive reduction on the complexity of optimization for non-strongly convex objectives
- Nonconvex optimization with inertial proximal stochastic variance reduction gradient
- An improvement of stochastic gradient descent approach for mean-variance portfolio optimization problem
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- Minimizing finite sums with the stochastic average gradient
- Ergodic convergence to a zero of the sum of monotone operators in Hilbert space
- An optimal randomized incremental gradient method
- Exact matrix completion via convex optimization
- Robust principal component analysis?
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
- Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
- Optimization Methods for Large-Scale Machine Learning
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- Signal Recovery by Proximal Forward-Backward Splitting
- A Stochastic Approximation Method
- SpiderBoost