The following pages link to SpiderBoost (Q5973151):
Displayed 15 items.
- Accelerating variance-reduced stochastic gradient methods (Q2118092)
- A hybrid stochastic optimization framework for composite nonconvex optimization (Q2118109)
- A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization (Q2149551)
- Finite-sum smooth optimization with SARAH (Q2149950)
- A linearly convergent stochastic recursive gradient method for convex optimization (Q2228399)
- Reproducing kernels: harmonic analysis and some of their applications (Q2659761)
- (Q4969167)
- (Q4969178)
- MultiLevel Composite Stochastic Optimization via Nested Variance Reduction (Q4987278)
- A Stochastic Proximal Alternating Minimization for Nonsmooth and Nonconvex Optimization (Q5024392)
- Fast Decentralized Nonconvex Finite-Sum Optimization with Recursive Variance Reduction (Q5026835)
- Some Limit Properties of Markov Chains Induced by Recursive Stochastic Algorithms (Q5037552)
- Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization (Q5076671)
- DESTRESS: Computation-Optimal and Communication-Efficient Decentralized Nonconvex Finite-Sum Optimization (Q5095229)
- Inexact SARAH algorithm for stochastic optimization (Q5859016)