Analysis and design of optimization algorithms via integral quadratic constraints
From MaRDI portal
Publication:3465237
Abstract: This manuscript develops a new framework to analyze and design iterative optimization algorithms built on the notion of Integral Quadratic Constraints (IQC) from robust control theory. IQCs provide sufficient conditions for the stability of complicated interconnected systems, and these conditions can be checked by semidefinite programming. We discuss how to adapt IQC theory to study optimization algorithms, proving new inequalities about convex functions and providing a version of IQC theory adapted for use by optimization researchers. Using these inequalities, we derive numerical upper bounds on convergence rates for the gradient method, the heavy-ball method, Nesterov's accelerated method, and related variants by solving small, simple semidefinite programming problems. We also briefly show how these techniques can be used to search for optimization algorithms with desired performance characteristics, establishing a new methodology for algorithm design.
Recommendations
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Robust and structure exploiting optimisation algorithms: an integral quadratic constraint approach
- Synthesis of accelerated gradient algorithms for optimization and saddle point problems using Lyapunov functions and LMIs
- Convex Synthesis of Accelerated Gradient Algorithms
- Analysis of optimization algorithms via sum-of-squares
Cites work
- scientific article; zbMATH DE number 427959 (title not available)
- scientific article; zbMATH DE number 432503 (title not available)
- scientific article; zbMATH DE number 4164577 (title not available)
- scientific article; zbMATH DE number 47864 (title not available)
- scientific article; zbMATH DE number 729680 (title not available)
- scientific article; zbMATH DE number 5060482 (title not available)
- scientific article; zbMATH DE number 3249151 (title not available)
- scientific article; zbMATH DE number 3100566 (title not available)
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Absolute stability of nonlinear systems of automatic control
- Dissipative dynamical systems. I: General theory
- Dissipative dynamical systems. II: Linear systems with quadratic supply rates
- Distributed asynchronous deterministic and stochastic gradient optimization algorithms
- Dualities in convex algebraic geometry
- Efficiency of coordinate descent methods on huge-scale optimization problems
- First-order methods of smooth convex optimization with inexact oracle
- Gradient methods for minimizing composite functions
- Graph implementations for nonsmooth convex programs
- Introductory lectures on convex optimization. A basic course.
- Lyapunov functions for the problem of Lur'e in automatic control
- Linear Matrix Inequalities in System and Control Theory
- Method of centers for minimizing generalized eigenvalues
- Nonconvex optimization problem: The infinite-horizon linear-quadratic control problem with quadratic constraints
- Nonlinear systems.
- Performance of first-order methods for smooth convex minimization: a novel approach
- Robust Stochastic Approximation Approach to Stochastic Programming
- Semidefinite programming relaxations and algebraic optimization in control
- Stability Analysis With Dissipation Inequalities and Integral Quadratic Constraints
- Stability Conditions for Systems with Monotone and Slope-Restricted Nonlinearities
- System analysis via integral quadratic constraints
- Templates for convex cone problems with applications to sparse signal recovery
- The complex structured singular value
- The long-step method of analytic centers for fractional problems
- Transient cool-down of a porous medium in pulsating flow
- Zames-Falb Multipliers for Quadratic Programming
Cited in
(only the first 100 citing items are shown)
- Regularized nonlinear acceleration
- Contractivity of Runge-Kutta methods for convex gradient systems
- Analysis of biased stochastic gradient descent using sequential semidefinite programs
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods
- Optimal step length for the maximal decrease of a self-concordant function by the Newton method
- Differentially Private Accelerated Optimization Algorithms
- Principled analyses and design of first-order methods with inexact proximal operators
- A forward-backward algorithm with different inertial terms for structured non-convex minimization problems
- Convex Synthesis of Accelerated Gradient Algorithms
- No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization
- Passivity-based analysis of the ADMM algorithm for constraint-coupled optimization
- Lippmann-Schwinger solvers for the computational homogenization of materials with pores
- On the Barzilai-Borwein basic scheme in FFT-based computational homogenization
- scientific article; zbMATH DE number 7370534 (title not available)
- Robust and structure exploiting optimisation algorithms: an integral quadratic constraint approach
- Convergence of first-order methods via the convex conjugate
- An adaptive Polyak heavy-ball method
- On the convergence analysis of aggregated heavy-ball method
- Explicit stabilised gradient descent for faster strongly convex optimisation
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- On the asymptotic linear convergence speed of Anderson acceleration, Nesterov acceleration, and nonlinear GMRES
- Higher-order power methods with momentum for solving the limiting probability distribution vector of higher-order Markov chains
- Analysis of a generalised expectation-maximisation algorithm for Gaussian mixture models: a control systems perspective
- Stability analysis by dynamic dissipation inequalities: on merging frequency-domain techniques with time-domain conditions
- On polarization-based schemes for the FFT-based computational homogenization of inelastic materials
- A review of nonlinear FFT-based computational homogenization methods
- A simple PID-based strategy for particle swarm optimization algorithm
- Heavy-ball-based optimal thresholding algorithms for sparse linear inverse problems
- Heavy-ball-based hard thresholding algorithms for sparse signal recovery
- Optimal deterministic algorithm generation
- A general system of differential equations to model first-order adaptive algorithms
- Nonlinear optimization filters for stochastic time-varying convex optimization
- Proximal gradient flow and Douglas-Rachford splitting dynamics: global exponential stability via integral quadratic constraints
- Complexity analysis for optimization methods
- Uniting Nesterov and heavy ball methods for uniform global asymptotic stability of the set of minimizers
- Zames-Falb multipliers for absolute stability: from O'Shea's contribution to convex searches
- scientific article; zbMATH DE number 1424228 (title not available)
- scientific article; zbMATH DE number 7626757 (title not available)
- On lower and upper bounds in smooth and strongly convex optimization
- A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization
- Multiscale analysis of accelerated gradient methods
- Understanding a Class of Decentralized and Federated Optimization Algorithms: A Multirate Feedback Control Perspective
- Fast gradient method for low-rank matrix estimation
- Another look at the fast iterative shrinkage/thresholding algorithm (FISTA)
- Generalizing the optimized gradient method for smooth convex minimization
- A regularization interpretation of the proximal point method for weakly convex functions
- A distributed accelerated optimization algorithm over time‐varying directed graphs with uncoordinated step‐sizes
- Operator splitting performance estimation: tight contraction factors and optimal parameter selection
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
- scientific article; zbMATH DE number 7370630 (title not available)
- Conic linear optimization for computer-assisted proofs. Abstracts from the workshop held April 10--16, 2022
- Exact worst-case performance of first-order methods for composite convex optimization
- An accelerated stochastic mirror descent method
- Search direction correction with normalized gradient makes first-order methods faster
- Robust accelerated gradient methods for smooth strongly convex functions
- Bounds for the tracking error of first-order online optimization methods
- PEPIT: computer-assisted worst-case analyses of first-order optimization methods in python
- Design of distributed and robust optimization algorithms. A systems theoretic approach
- On the necessity and sufficiency of discrete-time O'Shea-Zames-Falb multipliers
- Convergence rates of the heavy ball method for quasi-strongly convex optimization
- An optimal first order method based on optimal quadratic averaging
- Efficient first-order methods for convex minimization: a constructive approach
- A frequency-domain analysis of inexact gradient methods
- Analytical convergence regions of accelerated gradient descent in nonconvex optimization under regularity condition
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- The parameterized accelerated iteration method for solving the matrix equation \(AXB=C\)
- Synthesis of accelerated gradient algorithms for optimization and saddle point problems using Lyapunov functions and LMIs
- scientific article; zbMATH DE number 7370590 (title not available)
- Robustness analysis of uncertain discrete-time systems with dissipation inequalities and integral quadratic constraints
- A zeroing neural dynamics based acceleration optimization approach for optimizers in deep neural networks
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- On data-driven stabilization of systems with nonlinearities satisfying quadratic constraints
- Convergence of gradient algorithms for nonconvex \(C^{1+ \alpha}\) cost functions
- Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation
- Adaptive restart of the optimized gradient method for convex optimization
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- Analysis of optimization algorithms via sum-of-squares
- Computation of invariant sets for discrete‐time uncertain systems
- A fixed step distributed proximal gradient push‐pull algorithm based on integral quadratic constraint
- A control-theoretic perspective on optimal high-order optimization
- Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
- Convergence rates of proximal gradient methods via the convex conjugate
- scientific article; zbMATH DE number 7370620 (title not available)
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- Learning-based adaptive control with an accelerated iterative adaptive law
- From differential equation solvers to accelerated first-order methods for convex optimization
- A multivariate adaptive gradient algorithm with reduced tuning efforts
- Proximal Methods for Sparse Optimal Scoring and Discriminant Analysis
- Projected dynamical systems on irregular, non-Euclidean domains for nonlinear optimization
- Connections between Georgiou and Smith's robust stability type theorems and the nonlinear small-gain theorems
- The connections between Lyapunov functions for some optimization algorithms and differential equations
- Bearing-only distributed localization: a unified barycentric approach
- Understanding the acceleration phenomenon via high-resolution differential equations
- A dynamical view of nonlinear conjugate gradient methods with applications to FFT-based computational micromechanics
- Specialized fast algorithms for IQC feasibility and optimization problems.
- Accelerated optimization landscape of linear-quadratic regulator
- An optimal gradient method for smooth strongly convex minimization
- Iterative pre-conditioning for expediting the distributed gradient-descent method: the case of linear least-squares problem