Extrapolated sequential constraint method for variational inequality over the intersection of fixed-point sets
From MaRDI portal
Publication: 2234474
DOI: 10.1007/s11075-021-01067-z
zbMath: 1486.65068
arXiv: 2006.16217
OpenAlex: W3135954056
MaRDI QID: Q2234474
Mootta Prangprakhon, Nimit Nimana
Publication date: 19 October 2021
Published in: Numerical Algorithms
Full work available at URL: https://arxiv.org/abs/2006.16217
Fixed-point theorems (47H10)
Numerical methods for variational inequalities and related problems (65K15)
Related Items
A dynamic distributed conjugate gradient method for variational inequality problem over the common fixed-point constraints ⋮ Convergence of a distributed method for minimizing sum of convex functions with fixed point constraints
Cites Work
- Iterative methods for fixed point problems in Hilbert spaces
- Extrapolation and local acceleration of an iterative process for common fixed point problems
- Three-term conjugate gradient method for the convex optimization problem over the fixed point set of a nonexpansive mapping
- Strong convergence of projected subgradient methods for nonsmooth and nonstrictly convex minimization
- An acceleration scheme for row projection methods
- Viscosity approximation process for a sequence of quasinonexpansive mappings
- Convex optimization over fixed point sets of quasi-nonexpansive and nonexpansive mappings in utility-based bandwidth allocation problems with operational constraints
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Acceleration method for convex optimization over the fixed point set of a nonexpansive mapping
- Opial-Type Theorems and the Common Fixed Point Problem
- Outer approximation methods for solving variational inequalities in Hilbert space
- Iterative Algorithms for Nonlinear Operators
- A Sequential Constraint Method for Solving Variational Inequality over the Intersection of Fixed Point Sets
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Application of Quasi-Nonexpansive Operators to an Iterative Method for Variational Inequality
- A Use of Conjugate Gradient Direction for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Numerical Optimization
- Extrapolated cyclic subgradient projection methods for the convex feasibility problems and their numerical behaviour
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Methods for Variational Inequality Problem Over the Intersection of Fixed Point Sets of Quasi-Nonexpansive Operators
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Distributed Optimization for Network Resource Allocation With Nonsmooth Utility Functions
- Acceleration Method Combining Broadcast and Incremental Distributed Optimization Algorithms
- Fixed Point Optimization Algorithms for Distributed Optimization in Networked Systems
- Function minimization by conjugate gradients
- Some descent three-term conjugate gradient methods and their global convergence
- Convex programming in Hilbert space
- Weak convergence of the sequence of successive approximations for nonexpansive mappings
- A Weak-to-Strong Convergence Principle for Fejér-Monotone Methods in Hilbert Spaces
- Convex analysis and monotone operator theory in Hilbert spaces