Convergence rates of forward-Douglas-Rachford splitting method
DOI: 10.1007/s10957-019-01524-9 · zbMath: 1421.49014 · arXiv: 1801.01088 · OpenAlex: W2962827308 · Wikidata: Q91864464 · Scholia: Q91864464 · MaRDI QID: Q2317846
Cesare Molinari, Jingwei Liang, Jalal Fadili
Publication date: 13 August 2019
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/1801.01088
Keywords: Bregman distance; finite identification; forward-backward; local linear convergence; forward-Douglas-Rachford splitting method; partial smoothness
MSC classifications: Numerical mathematical programming methods (65K05); Convex programming (90C25); Numerical optimization and variational techniques (65K10); Nonsmooth analysis (49J52)
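The entry records the algorithmic keywords but not the scheme itself. For orientation, here is a minimal sketch of a three-operator iteration in the Davis-Yin form (see the cited work "A three-operator splitting scheme and its optimization applications"), which is closely related to the forward-Douglas-Rachford scheme the paper analyzes. The problem data, step size, and variable names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Sketch of a three-operator splitting iteration (Davis-Yin form) for
#   min_x  f(x) + g(x) + h(x),   h smooth with L-Lipschitz gradient.
# Illustrative instance (an assumption, not from the paper):
#   f = lam*||x||_1,  g = indicator of {x >= 0},  h = 0.5*||Ax - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = A @ (np.maximum(rng.standard_normal(100), 0) * (rng.random(100) < 0.1))
lam = 0.1

grad_h = lambda x: A.T @ (A @ x - b)             # gradient of h
L = np.linalg.norm(A, 2) ** 2                    # Lipschitz constant of grad_h
gamma = 1.0 / L                                  # step size in (0, 2/L)

prox_g = lambda x: np.maximum(x, 0)              # projection onto {x >= 0}
prox_f = lambda x: np.sign(x) * np.maximum(np.abs(x) - gamma * lam, 0)  # soft threshold

z = np.zeros(100)
for _ in range(500):
    xg = prox_g(z)                                # backward step on g
    xf = prox_f(2 * xg - z - gamma * grad_h(xg))  # reflected step with forward term on h
    z = z + (xf - xg)                             # Douglas-Rachford-style update (relaxation 1)

x = prox_g(z)  # the g-proximal sequence converges to a minimizer
```

With h = 0 the update reduces to a Douglas-Rachford-type iteration, and with g = 0 it reduces to forward-backward splitting, consistent with the unifying role of the splitting methods named in the keywords.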
Related Items
- Improving “Fast Iterative Shrinkage-Thresholding Algorithm”: Faster, Smarter, and Greedier
- Primal-dual fixed point algorithm based on adapted metric method for solving convex minimization problem with application
- Infeasibility Detection with Primal-Dual Hybrid Gradient for Large-Scale Linear Programming
- Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization
- Some modified fast iterative shrinkage thresholding algorithms with a new adaptive non-monotone stepsize strategy for nonsmooth and convex minimization problems
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Convergence rates with inexact non-expansive operators
- On the convergence of the iterates of the “fast iterative shrinkage/thresholding algorithm”
- A three-operator splitting scheme and its optimization applications
- Linear convergence of iterative soft-thresholding
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Introductory lectures on convex optimization. A basic course.
- From error bounds to the complexity of first-order descent methods for convex functions
- A unified approach to error bounds for structured convex optimization problems
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- A note on the forward-Douglas-Rachford splitting for monotone inclusion and convex optimization
- Local convergence properties of Douglas-Rachford and alternating direction method of multipliers
- The degrees of freedom of partly smooth regularizers
- Newton methods for nonsmooth convex minimization: connections among $\mathcal{U}$-Lagrangian, Riemannian Newton and SQP methods
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- A Generalized Forward-Backward Splitting
- Convergence Rate Analysis of the Forward-Douglas-Rachford Splitting Scheme
- Activity Identification and Local Linear Convergence of Forward--Backward-type Methods
- On the Numerical Solution of Heat Conduction Problems in Two and Three Space Variables
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- Variational Analysis
- Non-strictly convex minimization over the fixed point set of an asymptotically shrinking nonexpansive mapping
- Multidimensional Sparse Super-Resolution
- Solving monotone inclusions via compositions of nonexpansive averaged operators
- Sparsity and Smoothness Via the Fused Lasso
- Active Sets, Nonsmoothness, and Sensitivity
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Forward-Douglas-Rachford splitting and forward-partial inverse method for solving monotone inclusions
- An Extrinsic Look at the Riemannian Hessian
- Signal Recovery by Proximal Forward-Backward Splitting
- Local linear convergence analysis of Primal–Dual splitting methods
- Convex analysis and monotone operator theory in Hilbert spaces