Optimizing first-order methods for smooth convex minimization with Q-linear gradient convergence
From MaRDI portal, Publication Q5153449
Recommendations
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Optimized first-order methods for smooth convex minimization
- Generalizing the optimized gradient method for smooth convex minimization
- Optimizing the efficiency of first-order methods for the distance to an optimal solution of smooth convex functions
- Performance of first-order methods for smooth convex minimization: a novel approach
Cited in (10)
- First-Order Methods for Nonconvex Quadratic Minimization
- Generalizing the optimized gradient method for smooth convex minimization
- PEPit: computer-assisted worst-case analyses of first-order optimization methods in Python
- Optimized first-order methods for smooth convex minimization
- An optimal first order method based on optimal quadratic averaging
- An acceleration procedure for optimal first-order methods
- Optimizing the efficiency of first-order methods for the distance to an optimal solution of smooth convex functions
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Performance of first-order methods for smooth convex minimization: a novel approach
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods