Convergence rates of forward-Douglas-Rachford splitting method
From MaRDI portal
Abstract: Over the past years, operator splitting methods have become ubiquitous in non-smooth optimization owing to their simplicity and efficiency. In this paper, we consider the Forward-Douglas-Rachford splitting method (FDR) [10,40] and study both its global and local convergence rates. For the global rate, we establish a sublinear convergence rate in terms of a Bregman divergence suitably designed for the objective function. Moreover, when specializing to the Forward-Backward splitting method, we show that the convergence rate of the objective is actually \(o(1/k)\) for a large range of descent step-sizes. Locally, under the assumption that the non-smooth part of the optimization problem is partly smooth, we establish local linear convergence of the method. More precisely, we show that the sequence generated by the FDR method first (i) identifies a smooth manifold in a finite number of iterations, and then (ii) enters a local linear convergence regime, which is characterized, for instance, in terms of the structure of the underlying active smooth manifold. To illustrate the usefulness of these results, we present several numerical experiments from applied fields including signal/image processing, inverse problems, and machine learning.
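As a concrete illustration of the Forward-Backward special case discussed in the abstract, the sketch below applies forward-backward (proximal gradient) splitting to a toy LASSO problem. This is a minimal illustrative implementation, not code from the paper; the function names, the step-size choice, and the test problem are assumptions made here for demonstration.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (the 'backward' step for the l1 term)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, gamma, n_iter=500):
    """Forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    gamma is the descent step-size; convergence requires gamma in (0, 2/L)
    with L the Lipschitz constant of the gradient of the smooth part.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                            # forward (gradient) step
        x = soft_threshold(x - gamma * grad, gamma * lam)   # backward (proximal) step
    return x
```

For instance, with `A` the identity the minimizer is the componentwise soft-thresholding of `b`, which the iteration recovers to high accuracy. The non-smooth l1 term is partly smooth, so the iterates also identify the active support (the zero pattern) in finitely many iterations, matching the local behavior described in the abstract.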
Recommendations
- Convergence rate analysis of the forward-Douglas-Rachford splitting scheme
- Local linear convergence analysis of primal-dual splitting methods
- Activity identification and local linear convergence of Douglas-Rachford/ADMM under partial smoothness
- Local convergence properties of Douglas-Rachford and alternating direction method of multipliers
- Tight global linear convergence rate bounds for Douglas-Rachford splitting
Cites work
- scientific article; zbMATH DE number 1818892
- scientific article; zbMATH DE number 2155014
- scientific article; zbMATH DE number 3398324
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A generalized forward-backward splitting
- A note on the forward-Douglas-Rachford splitting for monotone inclusion and convex optimization
- A three-operator splitting scheme and its optimization applications
- A unified approach to error bounds for structured convex optimization problems
- Active Sets, Nonsmoothness, and Sensitivity
- Activity identification and local linear convergence of forward-backward-type methods
- An Extrinsic Look at the Riemannian Hessian
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Convergence rate analysis of the forward-Douglas-Rachford splitting scheme
- Convergence rates with inexact non-expansive operators
- Convex analysis and monotone operator theory in Hilbert spaces
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Error bounds, quadratic growth, and linear convergence of proximal methods
- Forward-Douglas-Rachford splitting and forward-partial inverse method for solving monotone inclusions
- From error bounds to the complexity of first-order descent methods for convex functions
- Introductory lectures on convex optimization. A basic course.
- Linear convergence of iterative soft-thresholding
- Local convergence properties of Douglas-Rachford and alternating direction method of multipliers
- Local linear convergence analysis of primal-dual splitting methods
- MultiDimensional Sparse Super-Resolution
- Non-strictly convex minimization over the fixed point set of an asymptotically shrinking nonexpansive mapping
- Newton methods for nonsmooth convex minimization: connections among \(\mathcal U\)-Lagrangian, Riemannian Newton and SQP methods
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- On the Numerical Solution of Heat Conduction Problems in Two and Three Space Variables
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Quasi-Fejérian analysis of some optimization algorithms
- Riemannian geometry. A modern introduction
- Signal Recovery by Proximal Forward-Backward Splitting
- Solving monotone inclusions via compositions of nonexpansive averaged operators
- Sparsity and Smoothness Via the Fused Lasso
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- The degrees of freedom of partly smooth regularizers
- The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than \(1/k^2\)
- Variational Analysis
Cited in (18 documents)
- Envelope functions: unifications and further properties
- Fast iterative regularization by reusing data
- Local linear convergence analysis of primal-dual splitting methods
- Activity identification and local linear convergence of forward-backward-type methods
- Convergence rate analysis of the forward-Douglas-Rachford splitting scheme
- Tight global linear convergence rate bounds for Douglas-Rachford splitting
- Infeasibility Detection with Primal-Dual Hybrid Gradient for Large-Scale Linear Programming
- Generalized conditional gradient with augmented Lagrangian for composite minimization
- Convergence Rate Analysis of Primal-Dual Splitting Schemes
- Local convergence properties of Douglas-Rachford and alternating direction method of multipliers
- On the convergence rate of a forward-backward type primal-dual splitting algorithm for convex optimization problems
- Convergence rate analysis of several splitting schemes
- On the linear convergence rate of a relaxed forward–backward splitting method
- Activity identification and local linear convergence of Douglas-Rachford/ADMM under partial smoothness
- Primal-dual fixed point algorithm based on adapted metric method for solving convex minimization problem with application
- Some modified fast iterative shrinkage thresholding algorithms with a new adaptive non-monotone stepsize strategy for nonsmooth and convex minimization problems
- An improved parameterized fast iterative shrinkage-thresholding algorithm with adaptive step size and its applications
- Improving "fast iterative shrinkage-thresholding algorithm": faster, smarter, and greedier