A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions
Publication: 2671453
DOI: 10.1007/s10957-022-02032-z
zbMath: 1489.90128
arXiv: 2106.08020
OpenAlex: W3173113055
MaRDI QID: Q2671453
Bart Vandereycken, André Uschmajew
Publication date: 3 June 2022
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/2106.08020
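For orientation, a minimal sketch of the classical rate in question, stated under the standard assumptions of an L-smooth, μ-strongly convex objective with condition number κ = L/μ (the textbook result from the Nesterov lectures listed under Cites Work, not a statement taken from this record):

\[
  x_{k+1} = x_k - \frac{2}{L+\mu}\,\nabla f(x_k),
  \qquad
  \|x_k - x^\ast\| \le \left(\frac{\kappa-1}{\kappa+1}\right)^{k} \|x_0 - x^\ast\|,
  \qquad \kappa = \frac{L}{\mu}.
\]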
Cites Work
- Unnamed Item
- Unnamed Item
- Stepsize analysis for descent methods
- Introductory lectures on convex optimization. A basic course.
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Measures for Symmetric Rank-One Updates
- Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation