Pages that link to "Item:Q4558142"
From MaRDI portal
The following pages link to RSG: Beating Subgradient Method without Smoothness and Strong Convexity (Q4558142):
Displayed 15 items.
- Subgradient methods for sharp weakly convex functions (Q1626538)
- General convergence analysis of stochastic first-order methods for composite optimization (Q2032020)
- Randomized smoothing variance reduction method for large-scale non-smooth convex optimization (Q2033403)
- On finite termination of an inexact proximal point algorithm (Q2171164)
- Faster subgradient methods for functions with Hölderian growth (Q2297653)
- A simple nearly optimal restart scheme for speeding up first-order methods (Q2696573)
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds (Q2696991)
- The gradient projection algorithm for a proximally smooth set and a function with Lipschitz continuous gradient (Q3304387)
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity (Q4558142)
- New nonasymptotic convergence rates of stochastic proximal point algorithm for stochastic convex optimization (Q5162590)
- Accelerate stochastic subgradient method by leveraging local growth condition (Q5236746)
- The method of codifferential descent for convex and global piecewise affine optimization (Q5859002)
- Radial duality. II: Applications and algorithms (Q6126645)
- Faster first-order primal-dual methods for linear programming using restarts and sharpness (Q6165583)
- On optimal universal first-order methods for minimizing heterogeneous sums (Q6191975)