New Douglas-Rachford algorithmic structures and their convergence analyses
From MaRDI portal
Abstract: In this paper we study new algorithmic structures with Douglas-Rachford (DR) operators to solve convex feasibility problems. We propose to embed the basic two-set-DR algorithmic operator into the String-Averaging Projections (SAP) and into the Block-Iterative Projection (BIP) algorithmic structures, thereby creating new DR algorithmic schemes that include the recently proposed cyclic Douglas-Rachford algorithm and the averaged DR algorithm as special cases. We further propose and investigate a new multiple-set-DR algorithmic operator. Convergence of all these algorithmic schemes is studied by using properties of strongly quasi-nonexpansive operators and firmly nonexpansive operators.
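The two-set DR operator underlying the abstract is, in standard form, T = (I + R_B R_A)/2, where R_C = 2 P_C − I is the reflection through a closed convex set C. A minimal sketch of one DR iteration for a simple feasibility problem follows; the specific sets (a unit ball and a halfspace) and all function names are illustrative assumptions, not the paper's construction, and the paper's SAP/BIP embeddings are not reproduced here.

```python
import numpy as np

def proj_ball(x, center, radius):
    # Euclidean projection onto the closed ball B(center, radius)
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def proj_halfspace(x, a, b):
    # Euclidean projection onto the halfspace {y : a . y <= b}
    val = a @ x - b
    return x if val <= 0 else x - val * a / (a @ a)

def reflect(P, x):
    # reflection R = 2P - I associated with a projection P
    return 2 * P(x) - x

def dr_step(PA, PB, x):
    # two-set Douglas-Rachford operator T = (I + R_B R_A)/2
    return 0.5 * (x + reflect(PB, reflect(PA, x)))

# Illustrative feasibility problem (assumed sets):
# A = unit ball at the origin, B = {y : y[0] >= 0.5}; A and B intersect.
PA = lambda x: proj_ball(x, np.zeros(2), 1.0)
PB = lambda x: proj_halfspace(x, np.array([-1.0, 0.0]), -0.5)

x = np.array([3.0, 2.0])
for _ in range(200):
    x = dr_step(PA, PB, x)
sol = PA(x)  # the "shadow" point P_A(x*) lies in A ∩ B at a fixed point
```

Since T is firmly nonexpansive (an average of the identity and a composition of nonexpansive reflections), the iterates converge for consistent problems, which is the property class the paper's analysis builds on.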
Recommendations
- The cyclic Douglas–Rachford algorithm with r-sets-Douglas–Rachford operators
- A convergent relaxation of the Douglas-Rachford algorithm
- On the local convergence of the Douglas-Rachford algorithm
- A cyclic Douglas-Rachford iteration scheme
- On Slater's condition and finite convergence of the Douglas-Rachford algorithm for solving convex feasibility problems in Euclidean spaces
Cites work
- A cyclic Douglas-Rachford iteration scheme
- A strongly convergent reflection method for finding the projection onto the intersection of two closed convex sets in a Hilbert space
- Averaging strings of sequential iterations for convex feasibility problems
- Block-Iterative Algorithms with Diagonally Scaled Oblique Projections for the Linear Feasibility Problem
- Block-iterative projection methods for parallel computation of solutions to convex feasibility problems
- Component-Averaged Row Projections: A Robust, Block-Parallel Scheme for Sparse Linear Systems
- Compositions and convex combinations of asymptotically regular firmly nonexpansive mappings are also asymptotically regular
- Convergence and perturbation resilience of dynamic string-averaging projection methods
- Convergence of String-Averaging Projection Schemes for Inconsistent Convex Feasibility Problems
- Convex analysis and monotone operator theory in Hilbert spaces
- Douglas-Rachford feasibility methods for matrix completion problems
- Finding best approximation pairs relative to two closed convex sets in Hilbert spaces
- Global convergence of a non-convex Douglas-Rachford iteration
- Iterative methods for fixed point problems in Hilbert spaces
- Linear and strong convergence of algorithms involving averaged nonexpansive operators
- On Diagonally Relaxed Orthogonal Projection Methods
- On Projection Algorithms for Solving Convex Feasibility Problems
- On the behavior of a block-iterative projection method for solving convex feasibility problems
- On the local convergence of the Douglas-Rachford algorithm
- On the string averaging method for sparse common fixed-point problems
- On weak convergence of the Douglas-Rachford method
- Proximal splitting methods in signal processing
- Recent results on Douglas-Rachford methods for combinatorial optimization problems
- Solving monotone inclusions via compositions of nonexpansive averaged operators
- The Douglas-Rachford algorithm in the absence of convexity
- The composition of projections onto closed convex sets in Hilbert space is asymptotically regular
- The cyclic Douglas-Rachford method for inconsistent feasibility problems
Cited in (13)
- Randomized Douglas–Rachford Methods for Linear Systems: Improved Accuracy and Efficiency
- Gearhart-Koshy acceleration for affine subspaces
- Unrestricted Douglas-Rachford algorithms for solving convex feasibility problems in Hilbert space
- Approximate Douglas-Rachford algorithm for two-sets convex feasibility problems
- Linear convergence rates for extrapolated fixed point algorithms
- The cyclic Douglas–Rachford algorithm with r-sets-Douglas–Rachford operators
- A randomized block Douglas-Rachford method for solving linear matrix equation
- A convergent relaxation of the Douglas-Rachford algorithm
- A new step size selection strategy for the superiorization methodology using subgradient vectors and its application for solving convex constrained optimization problems
- Deep neural network structures solving variational inequalities
- SURVEY: SIXTY YEARS OF DOUGLAS–RACHFORD
- The Douglas-Rachford algorithm for convex and nonconvex feasibility problems
- A cyclic Douglas-Rachford iteration scheme
This page was built for publication: New Douglas-Rachford algorithmic structures and their convergence analyses
MaRDI item Q2789608