Analysis and design of optimization algorithms via integral quadratic constraints
From MaRDI portal
Publication:3465237
Abstract: This manuscript develops a new framework to analyze and design iterative optimization algorithms built on the notion of Integral Quadratic Constraints (IQC) from robust control theory. IQCs provide sufficient conditions for the stability of complicated interconnected systems, and these conditions can be checked by semidefinite programming. We discuss how to adapt IQC theory to study optimization algorithms, proving new inequalities about convex functions and providing a version of IQC theory adapted for use by optimization researchers. Using these inequalities, we derive numerical upper bounds on convergence rates for the gradient method, the heavy-ball method, Nesterov's accelerated method, and related variants by solving small, simple semidefinite programming problems. We also briefly show how these techniques can be used to search for optimization algorithms with desired performance characteristics, establishing a new methodology for algorithm design.
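The abstract's central computation, certifying a convergence rate by checking a small semidefinite feasibility problem, can be sketched in a few lines. The snippet below is illustrative and is not code from the paper: it specializes the IQC framework to the gradient method on m-strongly convex functions with L-Lipschitz gradients, uses the pointwise sector constraint on the gradient as the quadratic constraint, normalizes the scalar Lyapunov certificate to P = 1, and replaces a semidefinite solver with a grid over the multiplier plus bisection on the rate rho (all of these simplifications are the author of this sketch's assumptions).

```python
import numpy as np

# Illustrative sketch (not the paper's code): for the gradient method
# x_{k+1} = x_k - alpha * grad f(x_k), with f m-strongly convex and
# L-smooth, the sector IQC yields (after normalizing P = 1) the 2x2 LMI
#   [[1 - rho^2 - lam*m*L,  lam*(m+L)/2 - alpha],
#    [lam*(m+L)/2 - alpha,  alpha^2 - lam     ]]  <=  0
# whose feasibility for some multiplier lam >= 0 certifies the linear
# convergence rate rho.

def lmi_feasible(rho, alpha, m, L, lams, tol=1e-12):
    """Check the 2x2 LMI over a grid of multipliers lam >= 0.
    A symmetric 2x2 matrix is negative semidefinite iff both diagonal
    entries are <= 0 and its determinant is >= 0."""
    a = 1.0 - rho**2 - lams * m * L       # top-left entry
    d = alpha**2 - lams                   # bottom-right entry
    b = lams * (m + L) / 2.0 - alpha      # off-diagonal entry
    return bool(np.any((a <= tol) & (d <= tol) & (a * d - b * b >= -tol)))

def certified_rate(alpha, m, L, iters=50):
    """Bisect on rho for the smallest rate the LMI can certify."""
    lams = np.linspace(0.0, 0.1, 20001)   # crude stand-in for an SDP solver
    lo, hi = 0.0, 1.0                     # rho = 1 is always certifiable
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if lmi_feasible(mid, alpha, m, L, lams):
            hi = mid
        else:
            lo = mid
    return hi

m, L = 1.0, 10.0
rho = certified_rate(2.0 / (m + L), m, L)
print(rho)   # a value near (L - m) / (L + m) = 9/11, the known tight rate
```

For the step size alpha = 2/(m+L) the certified rate recovers the classical bound (L-m)/(L+m); the paper's framework handles richer algorithms (heavy ball, Nesterov) by enlarging the state and the IQC, at the cost of a slightly larger LMI.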
Recommendations
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Robust and structure exploiting optimisation algorithms: an integral quadratic constraint approach
- Synthesis of accelerated gradient algorithms for optimization and saddle point problems using Lyapunov functions and LMIs
- Convex Synthesis of Accelerated Gradient Algorithms
- Analysis of optimization algorithms via sum-of-squares
Cites work
- scientific article; zbMATH DE number 427959 (no title available)
- scientific article; zbMATH DE number 432503 (no title available)
- scientific article; zbMATH DE number 4164577 (no title available)
- scientific article; zbMATH DE number 47864 (no title available)
- scientific article; zbMATH DE number 729680 (no title available)
- scientific article; zbMATH DE number 5060482 (no title available)
- scientific article; zbMATH DE number 3249151 (no title available)
- scientific article; zbMATH DE number 3100566 (no title available)
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Absolute stability of nonlinear systems of automatic control
- Dissipative dynamical systems. I: General theory
- Dissipative dynamical systems. II: Linear systems with quadratic supply rates
- Distributed asynchronous deterministic and stochastic gradient optimization algorithms
- Dualities in convex algebraic geometry
- Efficiency of coordinate descent methods on huge-scale optimization problems
- First-order methods of smooth convex optimization with inexact oracle
- Gradient methods for minimizing composite functions
- Graph implementations for nonsmooth convex programs
- Introductory lectures on convex optimization. A basic course.
- Lyapunov functions for the problem of Lur'e in automatic control
- Linear Matrix Inequalities in System and Control Theory
- Method of centers for minimizing generalized eigenvalues
- Nonconvex optimization problem: The infinite-horizon linear-quadratic control problem with quadratic constraints
- Nonlinear systems.
- Performance of first-order methods for smooth convex minimization: a novel approach
- Robust Stochastic Approximation Approach to Stochastic Programming
- Semidefinite programming relaxations and algebraic optimization in control
- Stability Analysis With Dissipation Inequalities and Integral Quadratic Constraints
- Stability Conditions for Systems with Monotone and Slope-Restricted Nonlinearities
- System analysis via integral quadratic constraints
- Templates for convex cone problems with applications to sparse signal recovery
- The complex structured singular value
- The long-step method of analytic centers for fractional problems
- Transient cool-down of a porous medium in pulsating flow
- Zames-Falb Multipliers for Quadratic Programming
Cited in
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- Complexity analysis for optimization methods
- Convex Synthesis of Accelerated Gradient Algorithms
- The connections between Lyapunov functions for some optimization algorithms and differential equations
- Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation
- Perturbed Fenchel duality and first-order methods
- Another look at the fast iterative shrinkage/thresholding algorithm (FISTA)
- A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization
- An optimal first order method based on optimal quadratic averaging
- Stability analysis by dynamic dissipation inequalities: on merging frequency-domain techniques with time-domain conditions
- Design of distributed and robust optimization algorithms. A systems theoretic approach
- Generalizing the optimized gradient method for smooth convex minimization
- Analytical convergence regions of accelerated gradient descent in nonconvex optimization under regularity condition
- On polarization-based schemes for the FFT-based computational homogenization of inelastic materials
- Regularized nonlinear acceleration
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Learning-based adaptive control with an accelerated iterative adaptive law
- On lower and upper bounds in smooth and strongly convex optimization
- Specialized fast algorithms for IQC feasibility and optimization problems.
- Proximal gradient flow and Douglas-Rachford splitting dynamics: global exponential stability via integral quadratic constraints
- Exact worst-case performance of first-order methods for composite convex optimization
- Zames-Falb multipliers for absolute stability: from O'Shea's contribution to convex searches
- scientific article; zbMATH DE number 7370590 (no title available)
- A dynamical view of nonlinear conjugate gradient methods with applications to FFT-based computational micromechanics
- A control-theoretic perspective on optimal high-order optimization
- Analysis of biased stochastic gradient descent using sequential semidefinite programs
- Bounds for the tracking error of first-order online optimization methods
- Convergence of first-order methods via the convex conjugate
- scientific article; zbMATH DE number 7370534 (no title available)
- Operator splitting performance estimation: tight contraction factors and optimal parameter selection
- An introduction to continuous optimization for imaging
- Synthesis of accelerated gradient algorithms for optimization and saddle point problems using Lyapunov functions and LMIs
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
- A review of nonlinear FFT-based computational homogenization methods
- Convergence rates of the heavy ball method for quasi-strongly convex optimization
- A forward-backward algorithm with different inertial terms for structured non-convex minimization problems
- Optimal deterministic algorithm generation
- Proximal Methods for Sparse Optimal Scoring and Discriminant Analysis
- Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
- A regularization interpretation of the proximal point method for weakly convex functions
- From differential equation solvers to accelerated first-order methods for convex optimization
- Robust accelerated gradient methods for smooth strongly convex functions
- Iterative pre-conditioning for expediting the distributed gradient-descent method: the case of linear least-squares problem
- Convergence rates of proximal gradient methods via the convex conjugate
- scientific article; zbMATH DE number 1424228 (no title available)
- Contractivity of Runge-Kutta methods for convex gradient systems
- scientific article; zbMATH DE number 7626757 (no title available)
- Robust and structure exploiting optimisation algorithms: an integral quadratic constraint approach
- An adaptive Polyak heavy-ball method
- On the convergence analysis of aggregated heavy-ball method
- Projected dynamical systems on irregular, non-Euclidean domains for nonlinear optimization
- Bearing-only distributed localization: a unified barycentric approach
- An optimal gradient method for smooth strongly convex minimization
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Explicit stabilised gradient descent for faster strongly convex optimisation
- Passivity-based analysis of the ADMM algorithm for constraint-coupled optimization
- Efficient first-order methods for convex minimization: a constructive approach
- scientific article; zbMATH DE number 7370620 (no title available)
- On the convergence analysis of the optimized gradient method
- The common-directions method for regularized empirical risk minimization
- Adaptive restart of the optimized gradient method for convex optimization
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- A frequency-domain analysis of inexact gradient methods
- Analysis of optimization algorithms via sum-of-squares
- Understanding the acceleration phenomenon via high-resolution differential equations
- scientific article; zbMATH DE number 7370630 (no title available)
- Uniting Nesterov and heavy ball methods for uniform global asymptotic stability of the set of minimizers
- Higher-order power methods with momentum for solving the limiting probability distribution vector of higher-order Markov chains
- Analysis of a generalised expectation-maximisation algorithm for Gaussian mixture models: a control systems perspective
- Lippmann-Schwinger solvers for the computational homogenization of materials with pores
- On the Barzilai-Borwein basic scheme in FFT-based computational homogenization
- Zames–Falb multipliers for convergence rate: motivating example and convex searches
- Accelerated optimization landscape of linear-quadratic regulator
- On the necessity and sufficiency of discrete-time O'Shea-Zames-Falb multipliers
- On data-driven stabilization of systems with nonlinearities satisfying quadratic constraints
- Principled analyses and design of first-order methods with inexact proximal operators
- An accelerated stochastic mirror descent method
- A multivariate adaptive gradient algorithm with reduced tuning efforts
- Understanding a Class of Decentralized and Federated Optimization Algorithms: A Multirate Feedback Control Perspective
- The parameterized accelerated iteration method for solving the matrix equation \(AXB=C\)
- Heavy-ball-based optimal thresholding algorithms for sparse linear inverse problems
- Heavy-ball-based hard thresholding algorithms for sparse signal recovery
- Fast symplectic integrator for Nesterov-type acceleration method
- Multiscale analysis of accelerated gradient methods
- Differentially Private Accelerated Optimization Algorithms
- Nonlinear optimization filters for stochastic time-varying convex optimization
- A general system of differential equations to model first-order adaptive algorithms
- A simple PID-based strategy for particle swarm optimization algorithm
- Fast gradient method for low-rank matrix estimation
- Computation of invariant sets for discrete‐time uncertain systems
- PEPIT: computer-assisted worst-case analyses of first-order optimization methods in python
- Convergence of gradient algorithms for nonconvex \(C^{1+ \alpha}\) cost functions
- A fixed step distributed proximal gradient push‐pull algorithm based on integral quadratic constraint
- Connections between Georgiou and Smith's robust stability type theorems and the nonlinear small-gain theorems
- No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization
- Robustness analysis of uncertain discrete-time systems with dissipation inequalities and integral quadratic constraints
- Conic linear optimization for computer-assisted proofs. Abstracts from the workshop held April 10--16, 2022
- Mini-workshop: Analysis of data-driven optimal control. Abstracts from the mini-workshop held May 9--15, 2021 (hybrid meeting)
- Automated tight Lyapunov analysis for first-order methods