The following pages link to (Q4558559):
Displaying 14 items.
- Accelerating variance-reduced stochastic gradient methods (Q2118092)
- Oracle complexity separation in convex optimization (Q2139268)
- Laplacian smoothing gradient descent (Q2168883)
- Optimization for deep learning: an overview (Q2218095)
- Accelerated directional search with non-Euclidean prox-structure (Q2290400)
- Accelerated stochastic variance reduction for a class of convex optimization problems (Q2696969)
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent (Q4969070)
- (Q5054622)
- An Adaptive Gradient Method with Energy and Momentum (Q5865911)
- An aggressive reduction on the complexity of optimization for non-strongly convex objectives (Q6052286)
- Nonconvex optimization with inertial proximal stochastic variance reduction gradient (Q6052662)
- Block mirror stochastic gradient method for stochastic optimization (Q6158991)
- Adaptive proximal SGD based on new estimating sequences for sparser ERM (Q6196471)
- SGEM: stochastic gradient with energy and momentum (Q6202786)