Provably faster gradient descent via long steps
DOI: 10.1137/23M1588408
MaRDI QID: Q6579999
Publication date: 29 July 2024
Published in: SIAM Journal on Optimization
Keywords: semidefinite programming; convergence rates; smooth convex optimization; gradient descent; performance estimation; computer-assisted analysis
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Semidefinite programming (90C22)
Cites Work
- Performance of first-order methods for smooth convex minimization: a novel approach
- Optimized first-order methods for smooth convex minimization
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Efficient first-order methods for convex minimization: a constructive approach
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
- On Richardson's Method for Solving Linear Systems with Positive Definite Matrices
- Title not available
- Ordering of the iterative parameters in the cyclical Chebyshev iterative method
- On the convergence rate of the Halpern-iteration
- Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection
- Exploiting low-rank structure in semidefinite programming by approximate operator splitting
- An optimal gradient method for smooth strongly convex minimization
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Optimal Convergence Rates for the Proximal Bundle Method
- Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
- A relaxed interior point method for low-rank semidefinite programming problems with applications to matrix completion
- Scalable Semidefinite Programming
- Accelerated proximal point method for maximally monotone operators
- Optimal complexity and certification of Bregman first-order methods
- Tight Sublinear Convergence Rate of the Proximal Point Algorithm for Maximal Monotone Inclusion Problems
- Implicit Regularity and Linear Convergence Rates for the Generalized Trust-Region Subproblem
- An elementary approach to tight worst case complexity analysis of gradient based methods
- An Optimal-Storage Approach to Semidefinite Programming Using Approximate Complementarity
- Revisiting Spectral Bundle Methods: Primal-Dual (Sub)linear Convergence Rates
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods
- Principled analyses and design of first-order methods with inexact proximal operators
Cited In (1)