Line search fixed point algorithms based on nonlinear conjugate gradient directions: application to constrained smooth convex optimization
From MaRDI portal
Publication: 737229
DOI: 10.1186/s13663-016-0567-7
zbMath: 1342.47087
arXiv: 1509.05605
Wikidata: Q59470425 (Scholia: Q59470425)
MaRDI QID: Q737229
Publication date: 9 August 2016
Published in: Fixed Point Theory and Applications
Full work available at URL: https://arxiv.org/abs/1509.05605
Keywords: nonexpansive mapping; nonlinear conjugate gradient methods; fixed point problem; line search method; constrained smooth convex optimization; generalized convex feasibility problem; Krasnosel'skiĭ-Mann fixed point algorithm
65K05: Numerical mathematical programming methods
90C25: Convex programming
47N10: Applications of operator theory in optimization, convex analysis, mathematical programming, economics
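The keyword list above names the Krasnosel'skiĭ-Mann fixed point algorithm. As background, that iteration for a nonexpansive mapping T can be sketched as follows; this is a minimal textbook illustration with a hypothetical example mapping (projection onto an interval), not the paper's line search method:

```python
# Krasnosel'skii-Mann iteration: x_{n+1} = (1 - a) x_n + a T(x_n)
# for a nonexpansive mapping T and step size a in (0, 1).

def km_iterate(T, x0, steps=100, alpha=0.5):
    """Run the Krasnosel'skii-Mann iteration with constant step alpha."""
    x = x0
    for _ in range(steps):
        x = (1 - alpha) * x + alpha * T(x)
    return x

def proj_unit_interval(x):
    # Hypothetical example mapping: projection onto [0, 1] in R.
    # It is nonexpansive, and its fixed point set is exactly [0, 1].
    return min(1.0, max(0.0, x))

x_star = km_iterate(proj_unit_interval, x0=5.0)
# The iterate converges to a fixed point of T (here, the point 1.0).
print(abs(proj_unit_interval(x_star) - x_star) < 1e-9)
```

With the constant step alpha = 0.5 the residual |x - T(x)| shrinks geometrically for this example; the paper's contribution, per its title, is to choose search directions of nonlinear conjugate gradient type together with a line search instead of a fixed step.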
Cites Work
- Nonsmooth optimization via quasi-Newton methods
- A dynamical system associated with the fixed points set of a nonexpansive operator
- Iterative algorithm for solving triple-hierarchical constrained optimization problem
- Iterative approximation of fixed points
- Approximation of fixed points of nonexpansive mappings
- Solving variational inequality and fixed point problems by line searches and potential optimization
- Strong convergence theorems for nonexpansive mappings and nonexpansive semigroups
- Forcing strong convergence of proximal point iterations in a Hilbert space
- On the rate of convergence of Krasnosel'skiĭ-Mann iterations and their connection with sums of Bernoullis
- Acceleration method for convex optimization over the fixed point set of a nonexpansive mapping
- Hard-constrained inconsistent signal feasibility problems
- Algorithm 851
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- On Projection Algorithms for Solving Convex Feasibility Problems
- Iterative Algorithm for Triple-Hierarchical Constrained Nonconvex Optimization Problem and Its Application to Network Bandwidth Allocation
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Fixed points of nonexpanding maps
- Convergence Conditions for Ascent Methods
- Weak convergence of the sequence of successive approximations for nonexpansive mappings
- Convergence Conditions for Ascent Methods. II: Some Corrections
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Mean Value Methods in Iteration
- Convex analysis and monotone operator theory in Hilbert spaces