An elementary approach to tight worst case complexity analysis of gradient based methods
From MaRDI portal
Publication:6165581
DOI: 10.1007/s10107-022-01899-0 · zbMath: 1522.90105 · OpenAlex: W4304957737 · MaRDI QID: Q6165581
Marc Teboulle, Yakov Vaisbourd
Publication date: 1 August 2023
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-022-01899-0
convex minimization; gradient descent; global rate of convergence; worst-case complexity analysis; composite minimization; performance estimation problem; proximal schemes
Analysis of algorithms and problem complexity (68Q25); Convex programming (90C25); Abstract computational complexity for mathematical programming problems (90C60); Nonlinear programming (90C30)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Ergodic convergence to a zero of the sum of monotone operators in Hilbert space
- A simplified view of first order methods for optimization
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Performance of first-order methods for smooth convex minimization: a novel approach
- Lagrangian methods for composite optimization
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- On the Convergence of the Proximal Point Algorithm for Convex Minimization
- First-Order Methods in Optimization
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
- Proximité et dualité dans un espace hilbertien
- Convex programming in Hilbert space