Analysis of optimization algorithms via sum-of-squares
From MaRDI portal
Publication:2046552
Abstract: We introduce a new framework for unifying and systematizing the performance analysis of first-order black-box optimization algorithms for unconstrained convex minimization. The low-cost iteration complexity enjoyed by first-order algorithms renders them particularly relevant for applications in machine learning and large-scale data analysis. Relying on sum-of-squares (SOS) optimization, we introduce a hierarchy of semidefinite programs that give increasingly better convergence bounds at higher levels of the hierarchy. Illustrating the power of the SOS hierarchy, we show that the (dual of the) first level corresponds to the Performance Estimation Problem (PEP) introduced by Drori and Teboulle [Math. Program., 145(1):451--482, 2014], a powerful framework for determining convergence rates of first-order optimization algorithms. Consequently, many results obtained within the PEP framework can be reinterpreted as degree-1 SOS proofs, and thus the SOS framework provides a promising new approach for certifying improved rates of convergence by means of higher-order SOS certificates. To determine analytical rate bounds, in this work we use the first level of the SOS hierarchy and derive new results for noisy gradient descent with inexact line search methods (Armijo, Wolfe, and Goldstein).
Recommendations
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Analysis and design of optimization algorithms via integral quadratic constraints
- Performance of first-order methods for smooth convex minimization: a novel approach
- DSOS and SDSOS optimization: more tractable alternatives to sum of squares and semidefinite optimization
- Sum-of-squares optimization without semidefinite programming
Cites work
- scientific article; zbMATH DE number 1818892 (no title available)
- scientific article; zbMATH DE number 527343 (no title available)
- scientific article; zbMATH DE number 2107836 (no title available)
- scientific article; zbMATH DE number 5060482 (no title available)
- A Sum of Squares Approximation of Nonnegative Polynomials
- An optimal variant of Kelley's cutting-plane method
- Analysis and design of optimization algorithms via integral quadratic constraints
- Analysis of biased stochastic gradient descent using sequential semidefinite programs
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Efficient first-order methods for convex minimization: a constructive approach
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- Exact worst-case performance of first-order methods for composite convex optimization
- Graph implementations for nonsmooth convex programs
- Linear and nonlinear programming
- On the convergence rate of the Halpern-iteration
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Operator splitting performance estimation: tight contraction factors and optimal parameter selection
- Optimized first-order methods for smooth convex minimization
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Performance of first-order methods for smooth convex minimization: a novel approach
- Polynomial optimization, sums of squares, and applications
- Rate of Convergence of Several Conjugate Gradient Algorithms
- SDPT3 — A Matlab software package for semidefinite programming, Version 1.3
- Semidefinite Optimization and Convex Algebraic Geometry
- Semidefinite programming relaxations for semialgebraic problems
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Solving semidefinite-quadratic-linear programs using SDPT3
- Sums of squares, moment matrices and optimization over polynomials
- Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation
Cited in (3)
- Analysis and design of optimization algorithms via integral quadratic constraints
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods
This page was built for publication: Analysis of optimization algorithms via sum-of-squares