Line search fixed point algorithms based on nonlinear conjugate gradient directions: application to constrained smooth convex optimization
From MaRDI portal
Abstract: This paper considers the fixed point problem for a nonexpansive mapping on a real Hilbert space and proposes novel line search fixed point algorithms to accelerate the search. The termination conditions for the line search are based on the well-known Wolfe conditions, which are used to ensure the convergence and stability of unconstrained optimization algorithms. The search directions for fixed points are generated from the steepest descent direction and conventional nonlinear conjugate gradient directions for unconstrained optimization. We carry out convergence and convergence rate analyses of the algorithms for solving the fixed point problem under certain assumptions. The main contribution of this paper is a concrete answer to an open question in constrained smooth convex optimization: whether nonlinear conjugate gradient algorithms can be devised to solve constrained smooth convex optimization problems. We show that the proposed fixed point algorithms include algorithms with nonlinear conjugate gradient directions that can solve such problems. To illustrate their practicality, we apply the algorithms to concrete constrained smooth convex optimization problems, such as constrained quadratic programming problems and generalized convex feasibility problems, and numerically compare them with previous algorithms based on the Krasnosel'skiĭ-Mann fixed point algorithm. The results show that the proposed algorithms dramatically reduce the running time and the number of iterations needed to find optimal solutions compared with the previous algorithms.
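For orientation, the baseline the paper compares against is the Krasnosel'skiĭ-Mann iteration x_{k+1} = x_k + α_k (T(x_k) − x_k) for a nonexpansive mapping T. The sketch below (not the paper's line search algorithm; the names `proj_disk` and `km_iterate` are hypothetical) applies it to a small convex feasibility problem in the plane, where T is a composition of metric projections onto two overlapping disks, a standard nonexpansive mapping whose fixed points lie in the intersection of the disks.

```python
# Illustrative sketch of the baseline Krasnosel'skii-Mann iteration,
# NOT the paper's proposed line search / conjugate gradient method.
import math

def proj_disk(x, center, radius):
    """Metric projection onto a closed disk; projections onto convex
    sets are nonexpansive, so compositions of them are too."""
    dx, dy = x[0] - center[0], x[1] - center[1]
    d = math.hypot(dx, dy)
    if d <= radius:
        return x
    s = radius / d
    return (center[0] + s * dx, center[1] + s * dy)

def km_iterate(T, x0, alpha=0.5, iters=500):
    """Krasnosel'skii-Mann: x <- x + alpha * (T(x) - x)."""
    x = x0
    for _ in range(iters):
        tx = T(x)
        x = (x[0] + alpha * (tx[0] - x[0]),
             x[1] + alpha * (tx[1] - x[1]))
    return x

# T composes projections onto two overlapping disks; its fixed
# points are exactly the points in the intersection of the disks.
T = lambda x: proj_disk(proj_disk(x, (0.0, 0.0), 1.0), (1.5, 0.0), 1.0)
x_star = km_iterate(T, (5.0, 5.0))
```

The paper's contribution is to replace the fixed step of this scheme with Wolfe-condition line searches along steepest descent and nonlinear conjugate gradient directions, which the reported experiments show converge in far fewer iterations.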
Recommendations
- Three-term conjugate gradient method for the convex optimization problem over the fixed point set of a nonexpansive mapping
- A Use of Conjugate Gradient Direction for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping
- Acceleration method for convex optimization over the fixed point set of a nonexpansive mapping
- Hybrid conjugate gradient method for a convex optimization problem over the fixed-point set of a nonexpansive mapping
- Proximal point algorithms for nonsmooth convex optimization with fixed point constraints
Cites work
- scientific article; zbMATH DE number 3843083
- scientific article; zbMATH DE number 3853749
- scientific article; zbMATH DE number 47597
- scientific article; zbMATH DE number 3526471
- scientific article; zbMATH DE number 5060482
- scientific article; zbMATH DE number 3278849
- scientific article; zbMATH DE number 3108780
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A dynamical system associated with the fixed points set of a nonexpansive operator
- A survey of nonlinear conjugate gradient methods
- Acceleration method for convex optimization over the fixed point set of a nonexpansive mapping
- Algorithm 851
- Approximation of fixed points of nonexpansive mappings
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Convex analysis and monotone operator theory in Hilbert spaces
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Fixed points of nonexpanding maps
- Forcing strong convergence of proximal point iterations in a Hilbert space
- Function minimization by conjugate gradients
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Hard-constrained inconsistent signal feasibility problems
- Iterative algorithm for solving triple-hierarchical constrained optimization problem
- Iterative algorithm for triple-hierarchical constrained nonconvex optimization problem and its application to network bandwidth allocation
- Iterative approximation of fixed points
- Mean Value Methods in Iteration
- Methods of conjugate gradients for solving linear systems
- Nonlinear functional analysis. Fixed point theory and its applications
- Nonsmooth optimization via quasi-Newton methods
- On Projection Algorithms for Solving Convex Feasibility Problems
- On the rate of convergence of Krasnosel'skiĭ-Mann iterations and their connection with sums of Bernoullis
- Solving variational inequality and fixed point problems by line searches and potential optimization
- Strong convergence theorems for nonexpansive mappings and nonexpansive semigroups.
- The conjugate gradient method in extremal problems
- The hybrid steepest descent method for the variational inequality problem over the intersection of fixed point sets of nonexpansive mappings
- Weak convergence of the sequence of successive approximations for nonexpansive mappings
Cited in (3)