Pages that link to "Item:Q2934088"
The following pages link to An optimal algorithm for stochastic strongly-convex optimization (Q2934088):
Displaying 30 items.
- Stochastic forward-backward splitting for monotone inclusions (Q289110)
- Nonparametric stochastic approximation with large step-sizes (Q309706)
- Minimizing finite sums with the stochastic average gradient (Q517295)
- Gradient-free two-point methods for solving stochastic nonsmooth convex optimization problems with small non-random noises (Q1616222)
- Optimal distributed stochastic mirror descent for strongly convex optimization (Q1640744)
- On variance reduction for stochastic smooth convex optimization with multiplicative noise (Q1739038)
- Convergence of stochastic proximal gradient algorithm (Q2019902)
- On strongly quasiconvex functions: existence results and proximal point algorithms (Q2116608)
- A modular analysis of adaptive (non-)convex optimization: optimism, composite objectives, variance reduction, and variational bounds (Q2290691)
- Exploiting problem structure in optimization under uncertainty via online convex optimization (Q2316616)
- Improving kernel online learning with a snapshot memory (Q2673322)
- Logarithmic regret in online linear quadratic control using Riccati updates (Q2674833)
- Relaxed-inertial proximal point type algorithms for quasiconvex minimization (Q2689858)
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity (Q4558142)
- (Q4558559)
- Perturbed Iterate Analysis for Asynchronous Stochastic Optimization (Q4588862)
- Analogues of Switching Subgradient Schemes for Relatively Lipschitz-Continuous Convex Programming Problems (Q4965108)
- Making the Last Iterate of SGD Information Theoretically Optimal (Q4987277)
- (Q4998940)
- (Q5053317)
- (Q5054653)
- On the Adaptivity of Stochastic Gradient-Based Optimization (Q5114394)
- Technical Note—Nonstationary Stochastic Optimization Under L_{p,q}-Variation Measures (Q5129221)
- New nonasymptotic convergence rates of stochastic proximal point algorithm for stochastic convex optimization (Q5162590)
- Convergence Rates for Deterministic and Stochastic Subgradient Methods without Lipschitz Continuity (Q5231668)
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning (Q5254990)
- Sparse online regression algorithm with insensitive loss functions (Q6536701)
- Simple uncoupled no-regret learning dynamics for extensive-form correlated equilibrium (Q6551257)
- Accelerated zero-order SGD method for solving the black box optimization problem under "overparametrization" condition (Q6588732)
- Tight analyses for subgradient descent. I: Lower bounds (Q6633273)