Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization

From MaRDI portal

DOI: 10.1137/16M108104X
zbMath: 1371.90108
arXiv: 1512.07516
OpenAlex: W2951335402
MaRDI QID: Q5275297

Adrien B. Taylor, François Glineur, Julien M. Hendrickx

Publication date: 11 July 2017

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1512.07516




Related Items (27)

- Optimal complexity and certification of Bregman first-order methods
- A frequency-domain analysis of inexact gradient methods
- Generalizing the Optimized Gradient Method for Smooth Convex Minimization
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- The exact worst-case convergence rate of the gradient method with fixed step lengths for \(L\)-smooth functions
- An optimal gradient method for smooth strongly convex minimization
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods
- An elementary approach to tight worst case complexity analysis of gradient based methods
- Principled analyses and design of first-order methods with inexact proximal operators
- Conic linear optimization for computer-assisted proofs. Abstracts from the workshop held April 10--16, 2022
- Another Look at the Fast Iterative Shrinkage/Thresholding Algorithm (FISTA)
- Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
- Efficient first-order methods for convex minimization: a constructive approach
- Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection
- Tight Sublinear Convergence Rate of the Proximal Point Algorithm for Maximal Monotone Inclusion Problems
- Accelerated proximal point method for maximally monotone operators
- On the convergence analysis of the optimized gradient method
- Analysis of biased stochastic gradient descent using sequential semidefinite programs
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point
- Analysis of the gradient method with an Armijo–Wolfe line search on a class of non-smooth convex functions
- Analysis of optimization algorithms via sum-of-squares
- Optimal step length for the Newton method: case of self-concordant functions
- Robust and structure exploiting optimisation algorithms: an integral quadratic constraint approach



