Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
MaRDI QID: Q6168888
DOI: 10.5802/ojmo.26
zbMath: 1516.90049
arXiv: 2206.03041
OpenAlex: W3203904555
Publication date: 9 August 2023
Published in: OJMO. Open Journal of Mathematical Optimization
Full work available at URL: https://arxiv.org/abs/2206.03041
Mathematics Subject Classification
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Nonlinear programming (90C30)
- Iterative procedures involving nonlinear operators (47J25)
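
For context: the algorithm named in the title builds on the primal-dual hybrid gradient (PDHG) iteration of Chambolle and Pock (cited below) for problems of the form min_x g(x) + f(Ax). The following is a minimal LaTeX sketch of that base iteration, assuming the standard composite setting, since the entry itself carries no abstract; the averaging step and the smoothed-gap restart test are specific to the paper and are only indicated schematically here.

\begin{aligned}
x^{k+1} &= \operatorname{prox}_{\tau g}\bigl(x^{k} - \tau A^{\top} y^{k}\bigr), \\
y^{k+1} &= \operatorname{prox}_{\sigma f^{*}}\bigl(y^{k} + \sigma A\,(2x^{k+1} - x^{k})\bigr), \\
\bar z^{K} &= \tfrac{1}{K} \textstyle\sum_{k=1}^{K} \bigl(x^{k}, y^{k}\bigr),
\end{aligned}
\qquad \tau \sigma \|A\|^{2} < 1,

where \bar z^{K} is the averaged iterate; the restarted variant reinitializes the iteration at \bar z^{K} once a progress criterion (in the paper, one based on the smoothed gap) is satisfied.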
Cites Work
- Smooth minimization of non-smooth functions
- Convergence rates with inexact non-expansive operators
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- Lectures on convex optimization
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- From error bounds to the complexity of first-order descent methods for convex functions
- A first-order primal-dual algorithm for convex problems with applications to imaging
- A splitting algorithm for dual monotone inclusions involving cocoercive operators
- Dualize, split, randomize: toward fast nonsmooth optimization algorithms
- Linear convergence of primal-dual gradient methods and their performance in distributed optimization
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Unified linear convergence of first-order primal-dual algorithms for saddle point problems
- Implicit Functions and Solution Mappings
- Variational Analysis
- A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization
- A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions
- On the Convergence of Stochastic Primal-Dual Hybrid Gradient
- A New Randomized Block-Coordinate Primal-Dual Proximal Algorithm for Distributed Optimization
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Adaptive restart of accelerated gradient methods under local quadratic growth condition
- Convergence Rate Analysis of Several Splitting Schemes
- Convex analysis and monotone operator theory in Hilbert spaces
- Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists
- Faster first-order primal-dual methods for linear programming using restarts and sharpness