On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
From MaRDI portal
Publication: 1679617
DOI: 10.1007/s11590-016-1087-4
zbMath: 1381.90067
arXiv: 1606.09365
OpenAlex: W2473376220
Wikidata: Q59512949 (Scholia: Q59512949)
MaRDI QID: Q1679617
Etienne de Klerk, François Glineur, Adrien B. Taylor
Publication date: 9 November 2017
Published in: Optimization Letters
Full work available at URL: https://arxiv.org/abs/1606.09365
Convex programming (90C25); Abstract computational complexity for mathematical programming problems (90C60); Methods of reduced gradient type (90C52)
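The record concerns the gradient method with exact line search applied to smooth strongly convex functions. As an illustration only (not taken from the paper or its worst-case analysis), the sketch below runs the method on a strongly convex quadratic, the one setting where the exact line-search step has a simple closed form; the function name `gradient_exact_line_search`, the test matrix, and the tolerances are hypothetical choices made for the example.

```python
# Minimal sketch: gradient method with exact line search on a strongly
# convex quadratic f(x) = 0.5 x'Ax - b'x (A symmetric positive definite).
# For quadratics, minimizing f(x - t g) over t gives t = (g'g)/(g'Ag).
import numpy as np

def gradient_exact_line_search(A, b, x0, tol=1e-10, max_iter=1000):
    """Minimize f(x) = 0.5 x'Ax - b'x by steepest descent with exact line search."""
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        g = A @ x - b                  # gradient of f at the current iterate
        if np.linalg.norm(g) < tol:    # stop once the gradient is (numerically) zero
            break
        t = (g @ g) / (g @ (A @ g))    # exact line-search step along -g
        x = x - t * g
    return x

if __name__ == "__main__":
    # Example with smoothness L = 10 and strong convexity mu = 1 (kappa = 10).
    A = np.diag([1.0, 10.0])
    b = np.array([1.0, 1.0])
    x_hat = gradient_exact_line_search(A, b, np.zeros(2))
    print(x_hat, np.linalg.solve(A, b))  # the two should agree
```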
Related Items
- A frequency-domain analysis of inexact gradient methods
- Generalizing the Optimized Gradient Method for Smooth Convex Minimization
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
- A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions
- The exact worst-case convergence rate of the gradient method with fixed step lengths for \(L\)-smooth functions
- An optimal gradient method for smooth strongly convex minimization
- Conditions for linear convergence of the gradient method for non-convex optimization
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods
- Principled analyses and design of first-order methods with inexact proximal operators
- Conic linear optimization for computer-assisted proofs. Abstracts from the workshop held April 10--16, 2022
- Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
- Efficient first-order methods for convex minimization: a constructive approach
- Adaptive Catalyst for Smooth Convex Optimization
- New stepsizes for the gradient method
- Tight Sublinear Convergence Rate of the Proximal Point Algorithm for Maximal Monotone Inclusion Problems
- Analysis of biased stochastic gradient descent using sequential semidefinite programs
- Analysis of optimization algorithms via sum-of-squares
- Optimal step length for the Newton method: case of self-concordant functions
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
Uses Software
Cites Work
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Linear and nonlinear programming.
- Introductory lectures on convex optimization. A basic course.
- Performance of first-order methods for smooth convex minimization: a novel approach
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization