On stochastic accelerated gradient with convergence rate
From MaRDI portal
Cites work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A Stochastic Approximation Method
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- Acceleration of Stochastic Approximation by Averaging
- Adaptive subgradient methods for online learning and stochastic optimization
- An optimal method for stochastic composite optimization
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Gradient methods for minimizing composite functions
- Incrementally updated gradient methods for constrained and regularized optimization
- Iteration-complexity of first-order penalty methods for convex programming
- Katyusha: the first direct acceleration of stochastic gradient methods
- Large-scale machine learning with stochastic gradient descent
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- Pegasos: primal estimated sub-gradient solver for SVM
- Robust Stochastic Approximation Approach to Stochastic Programming
- Smooth minimization of non-smooth functions
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
Cited in (9)
- Learning rate adaptation in stochastic gradient descent.
- Scientific article without available title; zbMATH DE number 5221408
- Convergence rates of accelerated proximal gradient algorithms under independent noise
- Accelerated gradient methods with absolute and relative noise in the gradient
- Improved complexities for stochastic conditional gradient methods under interpolation-like conditions
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- Accelerating generalized iterative scaling based on staggered Aitken method for on-line conditional random fields
- New Convergence Aspects of Stochastic Gradient Algorithms
- A new filter‐based stochastic gradient algorithm for dual‐rate ARX models
This page was built for publication: On stochastic accelerated gradient with convergence rate
MaRDI item: Q2111814