On stochastic accelerated gradient with convergence rate
Publication: 2111814
DOI: 10.1515/math-2022-0499
MaRDI QID: Q2111814
Xingxing Zha, Yiyuan Cheng, Yongquan Zhang
Publication date: 17 January 2023
Published in: Open Mathematics
Full work available at URL: https://doi.org/10.1515/math-2022-0499
Keywords: convergence rate; logistic regression; least-squares regression; accelerated stochastic approximation
68Q25: Analysis of algorithms and problem complexity
68Q30: Algorithmic information theory (Kolmogorov complexity, etc.)
68Q19: Descriptive complexity and finite models
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- Gradient methods for minimizing composite functions
- An optimal method for stochastic composite optimization
- Pegasos: primal estimated sub-gradient solver for SVM
- Iteration-complexity of first-order penalty methods for convex programming
- Incrementally updated gradient methods for constrained and regularized optimization
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Large-Scale Machine Learning with Stochastic Gradient Descent
- Robust Stochastic Approximation Approach to Stochastic Programming
- Acceleration of Stochastic Approximation by Averaging
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- Katyusha: the first direct acceleration of stochastic gradient methods
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- A Stochastic Approximation Method
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization
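The keywords above point at accelerated stochastic approximation applied to least-squares and logistic regression. As a purely illustrative sketch of that family of methods (a generic Nesterov-style accelerated SGD step, not the specific scheme analyzed in this paper; all hyperparameter choices here are assumptions), a least-squares instance might look like:

```python
import numpy as np

def accelerated_sgd_least_squares(X, y, lr=0.01, momentum=0.9, epochs=5, seed=0):
    """Nesterov-style accelerated SGD for least-squares regression.

    Illustrative sketch only: the step size, momentum constant, and
    update rule are generic textbook choices, not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)  # parameter vector
    v = np.zeros(d)  # momentum (velocity) term
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Stochastic gradient of 0.5 * (x_i @ w - y_i)^2,
            # evaluated at the Nesterov look-ahead point w + momentum * v
            w_look = w + momentum * v
            g = (X[i] @ w_look - y[i]) * X[i]
            v = momentum * v - lr * g
            w = w + v
    return w

# Usage: recover a planted linear model from noisy observations
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
w_true = rng.standard_normal(5)
y = X @ w_true + 0.01 * rng.standard_normal(200)
w_hat = accelerated_sgd_least_squares(X, y)
print(np.linalg.norm(w_hat - w_true))  # small residual if the run converged
```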