Minimizing finite sums with the stochastic average gradient
Publication: 517295
DOI: 10.1007/s10107-016-1030-6 · zbMath: 1358.90073 · arXiv: 1309.2388 · OpenAlex: W2963156201 · MaRDI QID: Q517295
Nicolas Le Roux, Mark Schmidt, Francis Bach
Publication date: 23 March 2017
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1309.2388
MSC Classification
- Analysis of algorithms and problem complexity (68Q25)
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Nonlinear programming (90C30)
- Stochastic programming (90C15)
- Stochastic approximation (62L20)
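For context, the stochastic average gradient (SAG) method named in the title keeps a table of the most recently computed gradient of each component function and steps along their running average. A minimal sketch (an illustration only, not the authors' implementation; the function names, step size, and test problem below are assumptions):

```python
import numpy as np

def sag(grad_i, n, dim, x0, step, iters, rng=None):
    """Sketch of the stochastic average gradient method.

    grad_i(x, i) returns the gradient of the i-th component f_i at x.
    A table stores the last gradient seen for each i; each iteration
    refreshes one entry and steps along the average of all entries.
    """
    rng = rng or np.random.default_rng(0)
    x = x0.copy()
    table = np.zeros((n, dim))  # last stored gradient for each component
    avg = np.zeros(dim)         # average of the stored gradients
    for _ in range(iters):
        i = rng.integers(n)
        g = grad_i(x, i)
        avg += (g - table[i]) / n  # update the average incrementally
        table[i] = g
        x -= step * avg
    return x
```

On a toy problem with f_i(x) = (x - c_i)^2 / 2, the iterates approach the mean of the c_i, the minimizer of the finite sum.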
Cites Work
- Primal-dual subgradient methods for convex problems
- Smooth minimization of non-smooth functions
- Pegasos: primal estimated sub-gradient solver for SVM
- A randomized Kaczmarz algorithm with exponential convergence
- Incremental gradient algorithms with stepsizes bounded away from zero
- Introductory lectures on convex optimization. A basic course.
- A Nonlinear GMRES Optimization Algorithm for Canonical Tensor Decomposition
- Hybrid Deterministic-Stochastic Methods for Data Fitting
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- An optimal algorithm for stochastic strongly-convex optimization
- Accelerated Stochastic Approximation
- Robust Stochastic Approximation Approach to Stochastic Programming
- Acceleration of Stochastic Approximation by Averaging
- Accelerated Stochastic Approximation
- A New Class of Incremental Gradient Methods for Least Squares Problems
- An Incremental Gradient(-Projection) Method with Momentum Term and Adaptive Stepsize Rule
- Semi-stochastic coordinate descent
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization, II: Shrinking Procedures and Optimal Algorithms
- A Convergent Incremental Gradient Method with a Constant Step Size
- A Stochastic Approximation Method
- Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm