Proximal gradient flow and Douglas-Rachford splitting dynamics: global exponential stability via integral quadratic constraints
From MaRDI portal
Publication: 2662276
Abstract: Many large-scale and distributed optimization problems can be brought into a composite form in which the objective function is given by the sum of a smooth term and a nonsmooth regularizer. Such problems can be solved via a proximal gradient method and its variants, thereby generalizing gradient descent to a nonsmooth setup. In this paper, we view proximal algorithms as dynamical systems and leverage techniques from control theory to study their global properties. In particular, for problems with strongly convex objective functions, we utilize the theory of integral quadratic constraints to prove the global exponential stability of the equilibrium points of the differential equations that govern the evolution of proximal gradient and Douglas-Rachford splitting flows. In our analysis, we use the fact that these algorithms can be interpreted as variable-metric gradient methods on suitable envelopes and exploit structural properties of the nonlinear terms that arise from the gradient of the smooth part of the objective function and the proximal operator associated with the nonsmooth regularizer. We also demonstrate that these envelopes can be obtained from the augmented Lagrangian associated with the original nonsmooth problem and establish conditions for global exponential convergence even in the absence of strong convexity.
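As a concrete illustration of the dynamics the abstract describes, the following sketch (with assumed problem data A, b, and gamma; this is not code from the paper) simulates the proximal gradient flow for a lasso-type composite problem, where the smooth term is a least-squares loss and the nonsmooth regularizer is an l1 norm whose proximal operator is soft-thresholding:

```python
import numpy as np

# Hedged sketch (assumed problem data, not the paper's code): simulate the
# proximal gradient flow
#     dx/dt = -(1/mu) * ( x - prox_{mu*g}( x - mu * grad_f(x) ) )
# for the composite problem  minimize f(x) + g(x)  with
#     f(x) = 0.5 * ||A x - b||^2   (smooth, strongly convex for full-rank A)
#     g(x) = gamma * ||x||_1       (nonsmooth regularizer)
# The equilibrium of the flow is the minimizer of f + g.

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5)) + 2.0 * np.eye(20, 5)  # full column rank
b = rng.standard_normal(20)
gamma = 0.1                    # regularization weight (assumed value)
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of grad_f
mu = 1.0 / L                   # proximal parameter

def grad_f(x):
    return A.T @ (A @ x - b)

def prox_g(v, t):
    # soft-thresholding: proximal operator of t * gamma * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t * gamma, 0.0)

# Forward-Euler integration of the flow; dt = mu/2 makes each step a
# convex combination of x and the proximal gradient map T(x).
x = np.zeros(5)
dt = mu / 2.0
for _ in range(3000):
    T_x = prox_g(x - mu * grad_f(x), mu)
    x = x + dt * (-(x - T_x) / mu)

# At equilibrium, x is a fixed point of the proximal gradient map,
# so the fixed-point residual vanishes.
residual = np.linalg.norm(x - prox_g(x - mu * grad_f(x), mu))
```

For strongly convex f, the paper's IQC analysis certifies that trajectories of this flow converge exponentially to the equilibrium; the simulation above only illustrates convergence of the discretized dynamics to the fixed point of the proximal gradient map.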
Recommendations
- Exponential stability of partial primal-dual gradient dynamics with nonsmooth objective functions
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists
- A Proximal Minimization Algorithm for Structured Nonconvex and Nonsmooth Problems
- Semi-global exponential stability of augmented primal-dual gradient dynamics for constrained convex optimization
Cites work
- scientific article; zbMATH DE number 3148887
- scientific article; zbMATH DE number 3313108
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A New Randomized Block-Coordinate Primal-Dual Proximal Algorithm for Distributed Optimization
- A differential equation for modeling Nesterov's accelerated gradient method: theory and insights
- A dynamical systems approach to constrained minimization
- A variational perspective on accelerated methods in optimization
- Analysis and design of optimization algorithms via integral quadratic constraints
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Asymptotic convergence of constrained primal-dual dynamics
- Distributed Subgradient Methods for Multi-Agent Optimization
- Exponential Decay Rate Conditions for Uncertain Linear Systems Using Integral Quadratic Constraints
- Forward-backward envelope for the sum of two nonconvex functions: further properties and nonmonotone linesearch algorithms
- Forward-backward quasi-Newton methods for nonsmooth optimization problems
- Linear Convergence and Metric Selection for Douglas-Rachford Splitting and ADMM
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- On the Numerical Solution of Heat Conduction Problems in Two and Three Space Variables
- Some effective methods for unconstrained optimization based on the solution of systems of ordinary differential equations
- Stability of primal-dual gradient dynamics and applications to network optimization
- System analysis via integral quadratic constraints
- The Proximal Augmented Lagrangian Method for Nonsmooth Composite Optimization
- The Role of Convexity in Saddle-Point Dynamics: Lyapunov Function and Robustness
Cited in (7)
- Passivity-based analysis of the ADMM algorithm for constraint-coupled optimization
- Nonlinear optimization filters for stochastic time-varying convex optimization
- Exponential stability of partial primal-dual gradient dynamics with nonsmooth objective functions
- A new dynamical system with self-adaptive dynamical stepsize for pseudomonotone mixed variational inequalities
- Neurodynamic approaches for multi-agent distributed optimization
- A frequency-domain analysis of inexact gradient methods
- A second order primal-dual dynamical system for a convex-concave bilinear saddle point problem