Conjugate gradient methods using value of objective function for unconstrained optimization
From MaRDI portal
Publication:1758034
DOI: 10.1007/s11590-011-0324-0 · zbMath: 1278.90382 · OpenAlex: W1993524345 · MaRDI QID: Q1758034
Yasushi Narushima, Hideaki Iiduka
Publication date: 7 November 2012
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-011-0324-0
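For background, a generic nonlinear conjugate gradient iteration can be sketched as follows. This uses the classical Fletcher–Reeves update with an Armijo backtracking line search; it is a textbook illustration of the method family this publication belongs to, not the specific objective-function-based method proposed by Narushima and Iiduka.

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Textbook nonlinear CG (Fletcher-Reeves beta) with Armijo backtracking.

    A generic sketch only -- not the algorithm of the cited paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves formula
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:
            d = -g_new  # restart if d is not a descent direction
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = fletcher_reeves_cg(f, grad, np.zeros(2))
```

On a quadratic objective the iterates approach the solution of the linear system `A x = b`, which connects the nonlinear method back to the classical linear conjugate gradient algorithm of Hestenes and Stiefel cited below.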
Related Items
- A New Formula on the Conjugate Gradient Method for Removing Impulse Noise Images
- A memory gradient method based on the nonmonotone technique
- A new type of descent conjugate gradient method with exact line search
Uses Software
Cites Work
- Global convergence of a memory gradient method for unconstrained optimization
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- Hybrid conjugate gradient method for a convex optimization problem over the fixed-point set of a nonexpansive mapping
- A conjugate direction algorithm without line searches
- Flow search approach and new bounds for the \(m\)-step linear conjugate gradient algorithm
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Memory gradient method for the minimization of functions
- Study on a supermemory gradient method for the minimization of functions
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Algorithm 851
- A Use of Conjugate Gradient Direction for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Some descent three-term conjugate gradient methods and their global convergence
- A new nonlinear conjugate gradient method for unconstrained optimization
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles.
- An efficient hybrid conjugate gradient method for unconstrained optimization