Pages that link to "Item:Q1659678"
The following pages link to New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure (Q1659678):
Displaying 8 items.
- Convergence rates of subgradient methods for quasi-convex optimization problems (Q782917)
- Accelerated first-order methods for hyperbolic programming (Q1717219)
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity (Q4558142)
- Sharpness, Restart, and Acceleration (Q5210521)
- “Efficient” Subgradient Methods for General Convex Optimization (Q5506689)
- Faster first-order primal-dual methods for linear programming using restarts and sharpness (Q6165583)
- Stochastic algorithms with geometric step decay converge linearly on sharp functions (Q6608032)
- Perseus: a simple and optimal high-order method for variational inequalities (Q6665392)