Accelerating incremental gradient optimization with curvature information
DOI: 10.1007/s10589-020-00183-1
zbMath: 1443.90276
arXiv: 1806.00125
OpenAlex: W3009875021
MaRDI QID: Q2181597
Anna Scaglione, César A. Uribe, Wei Shi, Hoi-To Wai, Angelia Nedić
Publication date: 19 May 2020
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1806.00125
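The paper concerns the curvature-aided incremental aggregated gradient (CIAG) scheme (see the CIAG entry under Uses Software below), in which each component's stale gradient is corrected by a first-order Taylor term built from a stored Hessian. The following is a minimal sketch of that idea, assuming a cyclic component order and running-aggregate bookkeeping; the function names and the toy least-squares problem are illustrative choices, not the authors' reference implementation.

```python
# Minimal sketch of the curvature-aided incremental aggregated
# gradient (CIAG) idea; illustrative only, not the paper's code.
import numpy as np

def ciag(grads, hessians, theta0, gamma=1e-2, epochs=100):
    """Minimize f(x) = sum_i f_i(x) from delayed per-component info.

    grads[i](x)    -> gradient of f_i at x
    hessians[i](x) -> Hessian  of f_i at x
    """
    n = len(grads)
    theta = theta0.astype(float).copy()
    # Per-component snapshots: the last iterate at which f_i was
    # (re-)evaluated, plus its gradient and Hessian there.
    snap_t = [theta.copy() for _ in range(n)]
    snap_g = [grads[i](theta) for i in range(n)]
    snap_H = [hessians[i](theta) for i in range(n)]
    # Aggregates kept so a step costs O(d^2) instead of O(n d^2):
    # the surrogate gradient is b + H @ theta, where
    # b = sum_i (g_i - H_i @ t_i) and H = sum_i H_i.
    H = sum(snap_H)
    b = sum(g - Hi @ t for g, Hi, t in zip(snap_g, snap_H, snap_t))
    for k in range(epochs * n):
        i = k % n  # cyclic component selection
        # Refresh component i's snapshot at the current iterate
        # and patch the running aggregates.
        b -= snap_g[i] - snap_H[i] @ snap_t[i]
        H -= snap_H[i]
        snap_t[i] = theta.copy()
        snap_g[i] = grads[i](theta)
        snap_H[i] = hessians[i](theta)
        b += snap_g[i] - snap_H[i] @ snap_t[i]
        H += snap_H[i]
        # Taylor-corrected aggregated direction, then a gradient step.
        theta -= gamma * (b + H @ theta)
    return theta

# Toy usage: four quadratics f_i(x) = 0.5 * ||A_i x - y_i||^2, whose
# exact minimizer solves (sum_i A_i^T A_i) x = sum_i A_i^T y_i.
rng = np.random.default_rng(0)
As = [rng.standard_normal((5, 3)) for _ in range(4)]
ys = [rng.standard_normal(5) for _ in range(4)]
grads = [lambda x, A=A, y=y: A.T @ (A @ x - y) for A, y in zip(As, ys)]
hessians = [lambda x, A=A: A.T @ A for A in As]
print(ciag(grads, hessians, np.zeros(3)))
```

For quadratic components, as in this toy problem, the Taylor correction is exact and each step reproduces a full-gradient step at incremental cost; the paper's interest is in how closely this is approached for general smooth components.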
Related Items
- Proximal variable smoothing method for three-composite nonconvex nonsmooth minimization with a linear operator
- Variable smoothing incremental aggregated gradient method for nonsmooth nonconvex regularized optimization
- Linear convergence of proximal incremental aggregated gradient method for nonconvex nonsmooth minimization problems
Uses Software
- CIAG
Cites Work
- Minimizing finite sums with the stochastic average gradient
- An optimal randomized incremental gradient method
- A globally convergent incremental Newton method
- Incremental Subgradient Methods for Nondifferentiable Optimization
- Non-asymptotic convergence analysis of inexact gradient methods for machine learning without strong convexity
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
- Optimization Methods for Large-Scale Machine Learning
- Convergence Rate of Incremental Gradient and Incremental Newton Methods
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- A Convergent Incremental Gradient Method with a Constant Step Size
- IQN: An Incremental Quasi-Newton Method with Local Superlinear Convergence Rate
- A Stochastic Approximation Method