Proximal gradient flow and Douglas-Rachford splitting dynamics: global exponential stability via integral quadratic constraints
DOI: 10.1016/j.automatica.2020.109311 · zbMATH: 1461.93433 · arXiv: 1908.09043 · OpenAlex: W3097133297 · Wikidata: Q115583226 · MaRDI QID: Q2662276
Authors: Sepideh Hassan-Moghaddam, Mihailo R. Jovanović
Publication date: 12 April 2021
Published in: Automatica
Full work available at URL: https://arxiv.org/abs/1908.09043
Recommendations
- Exponential stability of partial primal-dual gradient dynamics with nonsmooth objective functions
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists
- A Proximal Minimization Algorithm for Structured Nonconvex and Nonsmooth Problems
- Semi-global exponential stability of augmented primal-dual gradient dynamics for constrained convex optimization
Keywords: nonsmooth optimization; nonlinear dynamics; Douglas-Rachford splitting; global exponential stability; proximal algorithms; primal-dual methods; integral quadratic constraints (IQCs); proximal augmented Lagrangian; control for optimization; forward-backward envelope
MSC: Existence theories for free problems in one independent variable (49J05); Exponential stability (93D23)
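To illustrate the record's subject matter, the proximal gradient flow studied in the paper, \(\dot{x} = -\mu^{-1}\bigl(x - \mathrm{prox}_{\mu g}(x - \mu \nabla f(x))\bigr)\), can be sketched numerically: a unit forward-Euler step of this flow recovers the standard proximal gradient (ISTA) iteration. The lasso instance below is purely illustrative and not taken from the paper; all problem data (`A`, `b`, `gamma`) are assumptions for the sketch.

```python
import numpy as np

# Illustrative sketch (not the paper's experiments): proximal gradient
# flow for minimize f(x) + g(x) with
#   f(x) = 0.5*||Ax - b||^2  (smooth),  g(x) = gamma*||x||_1  (nonsmooth).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
gamma = 0.1
mu = 1.0 / np.linalg.norm(A.T @ A, 2)  # step size 1/L, L = Lipschitz const of grad f

def soft_threshold(v, t):
    # prox of t*||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(5)
for _ in range(2000):
    grad = A.T @ (A @ x - b)
    # dx/dt = -(x - prox_{mu g}(x - mu*grad))/mu; a unit forward-Euler
    # step gives the proximal gradient (ISTA) update below.
    x = soft_threshold(x - mu * grad, mu * gamma)

# fixed-point (optimality) residual should be essentially zero at convergence
res = np.linalg.norm(x - soft_threshold(x - mu * (A.T @ (A @ x - b)), mu * gamma))
print(res < 1e-8)
```

With `f` strongly convex (as in this random instance), the paper's IQC-based analysis guarantees that the flow, and hence this discretization, converges exponentially to the fixed point of the proximal gradient map.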
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Linear Convergence and Metric Selection for Douglas-Rachford Splitting and ADMM
- On the Numerical Solution of Heat Conduction Problems in Two and Three Space Variables
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- Analysis and design of optimization algorithms via integral quadratic constraints
- System analysis via integral quadratic constraints
- Distributed Subgradient Methods for Multi-Agent Optimization
- Some effective methods for unconstrained optimization based on the solution of systems of ordinary differential equations
- A differential equation for modeling Nesterov's accelerated gradient method: theory and insights
- Stability of primal-dual gradient dynamics and applications to network optimization
- Asymptotic convergence of constrained primal-dual dynamics
- The Role of Convexity in Saddle-Point Dynamics: Lyapunov Function and Robustness
- A New Randomized Block-Coordinate Primal-Dual Proximal Algorithm for Distributed Optimization
- The Proximal Augmented Lagrangian Method for Nonsmooth Composite Optimization
- A variational perspective on accelerated methods in optimization
- A dynamical systems approach to constrained minimization
- Forward-backward quasi-Newton methods for nonsmooth optimization problems
- Forward-backward envelope for the sum of two nonconvex functions: further properties and nonmonotone linesearch algorithms
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Exponential Decay Rate Conditions for Uncertain Linear Systems Using Integral Quadratic Constraints
Cited In (7)
- Passivity-based analysis of the ADMM algorithm for constraint-coupled optimization
- Nonlinear optimization filters for stochastic time-varying convex optimization
- A new dynamical system with self-adaptive dynamical stepsize for pseudomonotone mixed variational inequalities
- Neurodynamic approaches for multi-agent distributed optimization
- Exponential stability of partial primal-dual gradient dynamics with nonsmooth objective functions
- A frequency-domain analysis of inexact gradient methods
- A second order primal-dual dynamical system for a convex-concave bilinear saddle point problem