Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints

From MaRDI portal

Publication:3465237

DOI: 10.1137/15M1009597
zbMath: 1329.90103
arXiv: 1408.3595
MaRDI QID: Q3465237

Laurent Lessard, Benjamin Recht, Andrew K. Packard

Publication date: 21 January 2016

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1408.3595




Related Items (94)

Robustness analysis of uncertain discrete‐time systems with dissipation inequalities and integral quadratic constraints
Zames-Falb multipliers for absolute stability: from O'Shea's contribution to convex searches
Analysis of a generalised expectation–maximisation algorithm for Gaussian mixture models: a control systems perspective
A frequency-domain analysis of inexact gradient methods
Synthesis of accelerated gradient algorithms for optimization and saddle point problems using Lyapunov functions and LMIs
Zames–Falb multipliers for convergence rate: motivating example and convex searches
Unnamed Item
Differentially Private Accelerated Optimization Algorithms
Generalizing the Optimized Gradient Method for Smooth Convex Minimization
Optimal deterministic algorithm generation
Explicit stabilised gradient descent for faster strongly convex optimisation
Adaptive restart of the optimized gradient method for convex optimization
Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
Proximal gradient flow and Douglas-Rachford splitting dynamics: global exponential stability via integral quadratic constraints
Analytical convergence regions of accelerated gradient descent in nonconvex optimization under regularity condition
A regularization interpretation of the proximal point method for weakly convex functions
Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
On data-driven stabilization of systems with nonlinearities satisfying quadratic constraints
Convergence Rates of the Heavy Ball Method for Quasi-strongly Convex Optimization
An optimal gradient method for smooth strongly convex minimization
A zeroing neural dynamics based acceleration optimization approach for optimizers in deep neural networks
Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
A distributed accelerated optimization algorithm over time‐varying directed graphs with uncoordinated step‐sizes
A fixed step distributed proximal gradient push‐pull algorithm based on integral quadratic constraint
On the necessity and sufficiency of discrete-time O'Shea-Zames-Falb multipliers
Fast gradient method for low-rank matrix estimation
A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization
Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods
Optimal step length for the maximal decrease of a self-concordant function by the Newton method
Unnamed Item
Unnamed Item
Unnamed Item
Unnamed Item
No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization
Heavy-ball-based optimal thresholding algorithms for sparse linear inverse problems
Heavy-ball-based hard thresholding algorithms for sparse signal recovery
Uniting Nesterov and heavy ball methods for uniform global asymptotic stability of the set of minimizers
Understanding a Class of Decentralized and Federated Optimization Algorithms: A Multirate Feedback Control Perspective
Perturbed Fenchel duality and first-order methods
A forward-backward algorithm with different inertial terms for structured non-convex minimization problems
Principled analyses and design of first-order methods with inexact proximal operators
Convergence of gradient algorithms for nonconvex \(C^{1+ \alpha}\) cost functions
Conic linear optimization for computer-assisted proofs. Abstracts from the workshop held April 10--16, 2022
Computation of invariant sets for discrete‐time uncertain systems
Another Look at the Fast Iterative Shrinkage/Thresholding Algorithm (FISTA)
An Optimal First Order Method Based on Optimal Quadratic Averaging
Contractivity of Runge--Kutta Methods for Convex Gradient Systems
Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
Mini-workshop: Analysis of data-driven optimal control. Abstracts from the mini-workshop held May 9--15, 2021 (hybrid meeting)
Efficient first-order methods for convex minimization: a constructive approach
Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection
Multiscale Analysis of Accelerated Gradient Methods
Convergence Rates of Proximal Gradient Methods via the Convex Conjugate
Convergence of first-order methods via the convex conjugate
A simple PID-based strategy for particle swarm optimization algorithm
Stability analysis by dynamic dissipation inequalities: on merging frequency-domain techniques with time-domain conditions
Unnamed Item
A review of nonlinear FFT-based computational homogenization methods
Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
Smooth strongly convex interpolation and exact worst-case performance of first-order methods
Unnamed Item
On the convergence analysis of the optimized gradient method
On the Asymptotic Linear Convergence Speed of Anderson Acceleration, Nesterov Acceleration, and Nonlinear GMRES
An introduction to continuous optimization for imaging
Proximal Methods for Sparse Optimal Scoring and Discriminant Analysis
Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
Analysis of biased stochastic gradient descent using sequential semidefinite programs
Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
A multivariate adaptive gradient algorithm with reduced tuning efforts
Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems
Bounds for the tracking error of first-order online optimization methods
Regularized nonlinear acceleration
Analysis of optimization algorithms via sum-of-squares
Projected Dynamical Systems on Irregular, Non-Euclidean Domains for Nonlinear Optimization
Search Direction Correction with Normalized Gradient Makes First-Order Methods Faster
Unnamed Item
Bearing-only distributed localization: a unified barycentric approach
Iterative pre-conditioning for expediting the distributed gradient-descent method: the case of linear least-squares problem
Connections between Georgiou and Smith's Robust Stability Type Theorems and the Nonlinear Small-Gain Theorems
Learning-based adaptive control with an accelerated iterative adaptive law
The Connections Between Lyapunov Functions for Some Optimization Algorithms and Differential Equations
On polarization-based schemes for the FFT-based computational homogenization of inelastic materials
A dynamical view of nonlinear conjugate gradient methods with applications to FFT-based computational micromechanics
Understanding the acceleration phenomenon via high-resolution differential equations
From differential equation solvers to accelerated first-order methods for convex optimization
A control-theoretic perspective on optimal high-order optimization
Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions
Passivity-based analysis of the ADMM algorithm for constraint-coupled optimization
Robust and structure exploiting optimisation algorithms: an integral quadratic constraint approach
An adaptive Polyak heavy-ball method
On the convergence analysis of aggregated heavy-ball method
Convex Synthesis of Accelerated Gradient Algorithms
On the Convergence Rate of Incremental Aggregated Gradient Algorithms
Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization


Uses Software



Cites Work




This page was built for publication: Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints