A fundamental proof of convergence of alternating direction method of multipliers for weakly convex optimization
From MaRDI portal
Publication:2067860
DOI: 10.1186/s13660-019-2080-0
zbMath: 1499.90164
OpenAlex: W2946936191
Wikidata: Q127941685 (Scholia: Q127941685)
MaRDI QID: Q2067860
Publication date: 19 January 2022
Published in: Journal of Inequalities and Applications
Full work available at URL: https://doi.org/10.1186/s13660-019-2080-0
Numerical mathematical programming methods (65K05)
Convex programming (90C25)
Large-scale problems in mathematical programming (90C06)
Applications of mathematical programming (90C90)
Numerical optimization and variational techniques (65K10)
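For context, the method whose convergence the publication analyzes is the alternating direction method of multipliers (ADMM). A generic textbook sketch of its iteration, for a linearly constrained splitting \(\min_{x,z}\, f(x)+g(z)\) subject to \(Ax+Bz=c\) with penalty parameter \(\rho>0\) and scaled dual variable \(u\), is shown below; this is the standard scaled form, not necessarily the exact formulation used in the paper:

```latex
\begin{align*}
x^{k+1} &= \operatorname*{arg\,min}_{x}\; f(x) + \tfrac{\rho}{2}\,\bigl\|Ax + Bz^{k} - c + u^{k}\bigr\|_2^2, \\
z^{k+1} &= \operatorname*{arg\,min}_{z}\; g(z) + \tfrac{\rho}{2}\,\bigl\|Ax^{k+1} + Bz - c + u^{k}\bigr\|_2^2, \\
u^{k+1} &= u^{k} + Ax^{k+1} + Bz^{k+1} - c.
\end{align*}
```

In the weakly convex setting addressed by the paper, \(f\) or \(g\) may fail to be convex but becomes convex after adding a quadratic \(\tfrac{\mu}{2}\|\cdot\|^2\); convergence then typically requires \(\rho\) large enough relative to the weak-convexity modulus \(\mu\).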
Related Items (2)
A Splitting Scheme for Flip-Free Distortion Energies
An adaptive alternating direction method of multipliers
Uses Software
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- Precompact convergence of the nonconvex primal-dual hybrid gradient algorithm
- Parallel multi-block ADMM with \(o(1/k)\) convergence
- The Douglas-Rachford algorithm in the affine-convex case
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- On the convergence of the direct extension of ADMM for three-block separable convex minimization models with one strongly convex function
- On the global and linear convergence of the generalized alternating direction method of multipliers
- Linear Convergence of the Alternating Direction Method of Multipliers for a Class of Convex Optimization Problems
- Hankel Matrix Rank Minimization with Applications to System Identification and Realization
- On the $O(1/n)$ Convergence Rate of the Douglas–Rachford Alternating Direction Method
- A General Framework for a Class of First Order Primal-Dual Algorithms for Convex Optimization in Imaging Science
- Global Convergence of Splitting Methods for Nonconvex Composite Optimization
- Variational Analysis
- On the Convergence of the Iterative Shrinkage/Thresholding Algorithm With a Weakly Convex Penalty
- Fast Alternating Direction Optimization Methods
- Linear Rate Convergence of the Alternating Direction Method of Multipliers for Convex Composite Programming
- The Primal-Dual Hybrid Gradient Method for Semiconvex Splittings
- Convergence Analysis of Douglas–Rachford Splitting Method for “Strongly + Weakly” Convex Programming
- On the Convergence of Block Coordinate Descent Type Methods
- Convex Analysis