A mini-batch proximal stochastic recursive gradient algorithm with diagonal Barzilai-Borwein stepsize
Publication: Q6097380
DOI: 10.1007/s40305-022-00436-2
zbMath: 1524.90222
MaRDI QID: Q6097380
Xin-Wei Liu, Tengteng Yu, Jie Sun, Yu-Hong Dai
Publication date: 5 June 2023
Published in: Journal of the Operations Research Society of China
Keywords: Barzilai-Borwein method; composite optimization; proximal gradient algorithm; stochastic recursive gradient
MSC classification: Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Applications of mathematical programming (90C90); Nonlinear programming (90C30)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An optimal method for stochastic composite optimization
- Minimizing finite sums with the stochastic average gradient
- The restricted strong convexity revisited: analysis of equivalence to error bound and quadratic growth
- Inexact proximal stochastic gradient method for convex composite optimization
- Stochastic variance reduced gradient methods using a trust-region-like scheme
- Variable metric proximal stochastic variance reduced gradient methods for nonconvex nonsmooth optimization
- An inexact accelerated stochastic ADMM for separable convex optimization
- A linearly convergent stochastic recursive gradient method for convex optimization
- A family of spectral gradient methods for optimization
- Parallel stochastic gradient algorithms for large-scale matrix completion
- Two-Point Step Size Gradient Methods
- First-Order Methods in Optimization
- Optimization Methods for Large-Scale Machine Learning
- Stochastic proximal quasi-Newton methods for non-convex composite optimization
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Understanding Machine Learning
- A Stochastic Approximation Method