Linear and strong convergence of algorithms involving averaged nonexpansive operators
From MaRDI portal
Abstract: We introduce regularity notions for averaged nonexpansive operators. Combined with regularity notions of their fixed point sets, we obtain linear and strong convergence results for quasicyclic, cyclic, and random iterations. New convergence results on the Borwein-Tam method (BTM) and on the cyclically anchored Douglas-Rachford algorithm (CADRA) are also presented. Finally, we provide a numerical comparison of BTM, CADRA, and the classical method of cyclic projections for solving convex feasibility problems.
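The classical method of cyclic projections mentioned in the abstract can be sketched as follows. This is a minimal illustration, not code from the paper: it finds a point in the intersection of finitely many closed convex sets by projecting onto them in a fixed cyclic order. The two example sets (a Euclidean ball and a halfspace) and all parameter values are illustrative assumptions.

```python
import numpy as np

def project_ball(x, center, radius):
    """Metric projection onto the ball ||x - center|| <= radius."""
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def project_halfspace(x, a, b):
    """Metric projection onto the halfspace {x : <a, x> <= b}."""
    viol = a @ x - b
    return x if viol <= 0 else x - viol * a / (a @ a)

def cyclic_projections(x0, projections, sweeps=500):
    """One sweep applies every projector in order: x_{k+1} = (P_m ∘ ... ∘ P_1)(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(sweeps):
        for P in projections:
            x = P(x)
    return x

# Illustrative feasibility problem: unit ball intersected with a halfspace.
center, radius = np.zeros(2), 1.0
a, b = np.array([1.0, 1.0]), 0.5
x = cyclic_projections(
    np.array([3.0, -2.0]),
    [lambda x: project_ball(x, center, radius),
     lambda x: project_halfspace(x, a, b)],
)
```

Because metric projections onto convex sets are firmly nonexpansive (hence averaged), this iteration converges to a point of the intersection whenever that intersection is nonempty; the paper's regularity notions are what upgrade this to linear convergence.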
Recommendations
- Convergence rate analysis for averaged fixed point iterations in common fixed point problems
- Conical averagedness and convergence analysis of fixed point algorithms
- Successive Averages of Firmly Nonexpansive Mappings
- Regular Sequences of Quasi-Nonexpansive Operators and Their Applications
- A cyclic Douglas-Rachford iteration scheme
Cites work
- scientific article; zbMATH DE number 3853749
- scientific article; zbMATH DE number 47597
- scientific article; zbMATH DE number 515862
- scientific article; zbMATH DE number 1382772
- scientific article; zbMATH DE number 3365044
- A Norm Convergence Result on Random Products of Relaxed Projections in Hilbert Space
- A cyclic Douglas-Rachford iteration scheme
- About regularity of collections of sets
- Best approximation in inner product spaces
- Convergence of the Proximal Point Method for Metrically Regular Mappings
- Convex analysis and monotone operator theory in Hilbert spaces
- Finding best approximation pairs relative to two closed convex sets in Hilbert spaces
- Generalized solutions for the sum of two maximally monotone operators
- Iterative methods for fixed point problems in Hilbert spaces
- Metric subregularity and the proximal point method
- On Projection Algorithms for Solving Convex Feasibility Problems
- On the convergence of von Neumann's alternating projection algorithm for two sets
- Restricted normal cones and the method of alternating projections: theory
- Robinson's implicit function theorem and its extensions
- Solving monotone inclusions via compositions of nonexpansive averaged operators
- Strong Abadie CQ, ACQ, calmness and linear regularity
- Surrogate Projection Methods for Finding Fixed Points of Firmly Nonexpansive Mappings
- The hybrid steepest descent method for the variational inequality problem over the intersection of fixed point sets of nonexpansive mappings
- The rate of convergence for the cyclic projections algorithm. III: Regularity of convex sets
- The rate of linear convergence of the Douglas-Rachford algorithm for subspaces is the cosine of the Friedrichs angle
Cited in (50)
- Various Notions of Nonexpansiveness Coincide for Proximal Mappings of Functions
- Quadratic growth conditions and uniqueness of optimal solution to Lasso
- Local linear convergence of the ADMM/Douglas-Rachford algorithms without strong convexity and application to statistical imaging
- Convergence analysis of Douglas-Rachford splitting method for "strongly + weakly" convex programming
- Convergence rate analysis for averaged fixed point iterations in common fixed point problems
- New Douglas-Rachford algorithmic structures and their convergence analyses
- Convergence analysis of the generalized Douglas-Rachford splitting method under Hölder subregularity assumptions
- On the order of the operators in the Douglas-Rachford algorithm
- On the circumcentered-reflection method for the convex feasibility problem
- Weak, strong and linear convergence of the CQ-method via the regularity of Landweber operators
- Weak, strong, and linear convergence of a double-layer fixed point algorithm
- On Slater's condition and finite convergence of the Douglas-Rachford algorithm for solving convex feasibility problems in Euclidean spaces
- Outer approximation methods for solving variational inequalities in Hilbert space
- Error bounds for the method of simultaneous projections with infinitely many subspaces
- Survey: Sixty years of Douglas–Rachford
- Convergence rates for boundedly regular systems
- Regularity of sets under a reformulation in a product space with reduced dimension
- Circumcentering the Douglas-Rachford method
- Convergence rate of a gradient projection method for solving variational inequalities
- Globally Convergent Type-I Anderson Acceleration for Nonsmooth Fixed-Point Iterations
- Convergence properties of dynamic string-averaging projection methods in the presence of perturbations
- Regular Sequences of Quasi-Nonexpansive Operators and Their Applications
- On the finite convergence of the Douglas-Rachford algorithm for solving (not necessarily convex) feasibility problems in Euclidean spaces
- Projection methods for high numerical aperture phase retrieval
- Convergence rates for the relaxed Peaceman-Rachford splitting method on a monotone inclusion problem
- Extrapolated cyclic subgradient projection methods for the convex feasibility problems and their numerical behaviour
- Conical averagedness and convergence analysis of fixed point algorithms
- Linear convergence of the generalized Douglas-Rachford algorithm for feasibility problems
- On the linear convergence of the circumcentered-reflection method
- Convergence rate of the relaxed CQ algorithm under Hölderian type error bound property
- Complexity of the relaxed Peaceman-Rachford splitting method for the sum of two maximal strongly monotone operators
- Linear convergence of the Douglas-Rachford method for two closed sets
- Optimization on Spheres: Models and Proximal Algorithms with Computational Performance Comparisons
- Convergence rates with inexact non-expansive operators
- Union averaged operators with applications to proximal algorithms for MIN-convex functions
- Finitely convergent iterative methods with overrelaxations revisited
- An improved relaxed inertial projection algorithm for solving the minimum-norm solution of variational inequality and fixed point problems
- The Douglas-Rachford algorithm for convex and nonconvex feasibility problems
- Deep neural network structures solving variational inequalities
- Linear convergence of projection algorithms
- Randomized Douglas–Rachford Methods for Linear Systems: Improved Accuracy and Efficiency
- Iteration process for fixed point problems and zeros of maximal monotone operators
- Best approximation mappings in Hilbert spaces
- Implicit error bounds for Picard iterations on Hilbert spaces
- Reflection Methods for Inverse Problems with Applications to Protein Conformation Determination
- A modular string averaging procedure for solving the common fixed point problem for quasi-nonexpansive mappings in Hilbert space
- Linear convergence rates for extrapolated fixed point algorithms
- Finitely convergent deterministic and stochastic iterative methods for solving convex feasibility problems
- Scaled relative graphs: nonexpansive operators via 2D Euclidean geometry
- Constraint reduction reformulations for projection algorithms with applications to wavelet construction
This page was built for publication: Linear and strong convergence of algorithms involving averaged nonexpansive operators
MaRDI item Q401327