On stochastic accelerated gradient with convergence rate
DOI: 10.1515/math-2022-0499
OpenAlex: W4312532619
MaRDI QID: Q2111814
Xingxing Zha, Yiyuan Cheng, Yongquan Zhang
Publication date: 17 January 2023
Published in: Open Mathematics
Full work available at URL: https://doi.org/10.1515/math-2022-0499
MSC classifications:
- Analysis of algorithms and problem complexity (68Q25)
- Algorithmic information theory (Kolmogorov complexity, etc.) (68Q30)
- Descriptive complexity and finite models (68Q19)
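The record carries no abstract, so as orientation only: the cited works (FISTA, Nesterov's smooth minimization, Lan's optimal stochastic composite methods, Katyusha) all revolve around momentum/extrapolation steps driven by a stochastic gradient oracle. The sketch below shows a generic Nesterov/FISTA-style accelerated step with minibatch gradients; the function names, the step size `lr`, and the least-squares example are illustrative assumptions, not the specific algorithm or step-size schedule analyzed in this publication.

```python
import numpy as np

def accelerated_sgd(grad, x0, n_iters=1000, lr=0.01):
    """Generic Nesterov/FISTA-style accelerated step with a stochastic oracle.

    grad(x) returns a stochastic (unbiased) estimate of the gradient at x.
    This is a sketch of the general technique studied in the cited works,
    not the method of this paper.
    """
    x = x0.copy()
    y = x0.copy()          # extrapolation (momentum) sequence
    t = 1.0                # FISTA-style momentum coefficient
    for _ in range(n_iters):
        x_next = y - lr * grad(y)              # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # extrapolation step
        x, t = x_next, t_next
    return x

# Illustrative usage: least squares with minibatch gradients as the oracle.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((200, 10)), rng.standard_normal(200)

def minibatch_grad(x, batch=32):
    idx = rng.integers(0, A.shape[0], size=batch)
    Ab, bb = A[idx], b[idx]
    # Gradient of the mean squared residual over the sampled minibatch.
    return 2.0 * Ab.T @ (Ab @ x - bb) / batch

x_star = accelerated_sgd(minibatch_grad, np.zeros(10), n_iters=2000, lr=1e-3)
```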
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Pegasos: primal estimated sub-gradient solver for SVM
- Title not available
- Smooth minimization of non-smooth functions
- Gradient methods for minimizing composite functions
- Acceleration of Stochastic Approximation by Averaging
- A Stochastic Approximation Method
- Robust Stochastic Approximation Approach to Stochastic Programming
- Iteration-complexity of first-order penalty methods for convex programming
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- An optimal method for stochastic composite optimization
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Large-Scale Machine Learning with Stochastic Gradient Descent
- Incrementally updated gradient methods for constrained and regularized optimization
- Katyusha: the first direct acceleration of stochastic gradient methods
Cited In (9)
- Title not available
- Learning rate adaptation in stochastic gradient descent
- Convergence rates of accelerated proximal gradient algorithms under independent noise
- Accelerated gradient methods with absolute and relative noise in the gradient
- Improved complexities for stochastic conditional gradient methods under interpolation-like conditions
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- Accelerating generalized iterative scaling based on staggered Aitken method for on-line conditional random fields
- New Convergence Aspects of Stochastic Gradient Algorithms
- A new filter-based stochastic gradient algorithm for dual-rate ARX models