Pages that link to "Item:Q2301128"
From MaRDI portal
The following pages link to Restarting the accelerated coordinate descent method with a rough strong convexity estimate (Q2301128):
Displaying 12 items.
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions (Q2020604)
- An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization (Q2062324)
- A piecewise conservative method for unconstrained convex optimization (Q2070340)
- Restarting Frank-Wolfe: faster rates under Hölderian error bounds (Q2116603)
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis (Q2133415)
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate (Q2301128)
- Generalized self-concordant functions: a recipe for Newton-type methods (Q2330645)
- Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems (Q5870350)
- A class of modified accelerated proximal gradient methods for nonsmooth and nonconvex minimization problems (Q6145571)
- Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient (Q6168888)
- First-order methods for convex optimization (Q6169988)
- Coordinate descent methods beyond smoothness and separability (Q6498410)