Analysis of optimization algorithms via sum-of-squares
DOI: 10.1007/s10957-021-01869-0
zbMATH Open: 1475.90055
arXiv: 1906.04648
OpenAlex: W3172446611
MaRDI QID: Q2046552
FDO: Q2046552
Authors: Sandra S. Y. Tan, A. Varvitsiotis, Vincent Y. F. Tan
Publication date: 18 August 2021
Published in: Journal of Optimization Theory and Applications
Abstract: We introduce a new framework for unifying and systematizing the performance analysis of first-order black-box optimization algorithms for unconstrained convex minimization. The low-cost iteration complexity enjoyed by first-order algorithms renders them particularly relevant for applications in machine learning and large-scale data analysis. Relying on sum-of-squares (SOS) optimization, we introduce a hierarchy of semidefinite programs that give increasingly better convergence bounds for higher levels of the hierarchy. Alluding to the power of the SOS hierarchy, we show that the (dual of the) first level corresponds to the Performance Estimation Problem (PEP) introduced by Drori and Teboulle [Math. Program., 145(1):451--482, 2014], a powerful framework for determining convergence rates of first-order optimization algorithms. Consequently, many results obtained within the PEP framework can be reinterpreted as degree-1 SOS proofs, and thus, the SOS framework provides a promising new approach for certifying improved rates of convergence by means of higher-order SOS certificates. To determine analytical rate bounds, in this work we use the first level of the SOS hierarchy and derive new results for noisy gradient descent with inexact line search methods (Armijo, Wolfe, and Goldstein).
Full work available at URL: https://arxiv.org/abs/1906.04648
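The correspondence at the heart of the abstract, that the dual of the first SOS level is the PEP, can be sketched as follows. The display below is a schematic in illustrative notation, not the paper's exact formulation (the radius R, the valid inequalities g_j, and the multipliers σ_j are placeholders). For gradient descent with step size 1/L on an L-smooth convex function f with minimizer x_⋆, the PEP computes the worst-case rate after N steps as the value of

\[
\max_{f \in \mathcal{F}_L,\; x_0} \; f(x_N) - f(x_\star)
\quad \text{s.t.} \quad
\|x_0 - x_\star\| \le R, \qquad
x_{k+1} = x_k - \tfrac{1}{L}\nabla f(x_k), \quad k = 0, \dots, N-1,
\]

made finite-dimensional (an SDP) through interpolation conditions for the class \(\mathcal{F}_L\). On the SOS side, a rate \(\tau\) is certified at level \(\ell\) of the hierarchy by a Positivstellensatz-style identity of the form

\[
\tau\,\|x_0 - x_\star\|^2 - \bigl(f(x_N) - f(x_\star)\bigr)
= \sigma_0 + \sum_j \sigma_j\, g_j,
\]

where each \(\sigma_j\) is a sum of squares of bounded degree and the \(g_j \ge 0\) encode the interpolation and algorithmic constraints. Restricting the multipliers \(\sigma_j\), \(j \ge 1\), to nonnegative scalars gives the degree-1 certificates that the abstract identifies with PEP proofs; allowing higher-degree SOS multipliers is what opens the door to improved rates.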
Recommendations
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Analysis and design of optimization algorithms via integral quadratic constraints
- Performance of first-order methods for smooth convex minimization: a novel approach
- DSOS and SDSOS optimization: more tractable alternatives to sum of squares and semidefinite optimization
- Sum-of-squares optimization without semidefinite programming
Cites Work
- SDPT3 — A Matlab software package for semidefinite programming, Version 1.3
- Solving semidefinite-quadratic-linear programs using SDPT3
- Title not available
- Title not available
- Analysis and design of optimization algorithms via integral quadratic constraints
- Linear and nonlinear programming
- Graph implementations for nonsmooth convex programs
- Title not available
- Semidefinite programming relaxations for semialgebraic problems
- Sums of squares, moment matrices and optimization over polynomials
- Title not available
- Semidefinite Optimization and Convex Algebraic Geometry
- A Sum of Squares Approximation of Nonnegative Polynomials
- Performance of first-order methods for smooth convex minimization: a novel approach
- Optimized first-order methods for smooth convex minimization
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- An optimal variant of Kelley's cutting-plane method
- Rate of Convergence of Several Conjugate Gradient Algorithms
- Efficient first-order methods for convex minimization: a constructive approach
- Exact worst-case performance of first-order methods for composite convex optimization
- Polynomial optimization, sums of squares, and applications
- On the convergence rate of the Halpern-iteration
- Operator splitting performance estimation: tight contraction factors and optimal parameter selection
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Analysis of biased stochastic gradient descent using sequential semidefinite programs
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation
Cited In (3)
- Analysis and design of optimization algorithms via integral quadratic constraints
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods