Conjugate gradient methods using value of objective function for unconstrained optimization
From MaRDI portal
Publication:1758034
DOI: 10.1007/S11590-011-0324-0
zbMATH Open: 1278.90382
OpenAlex: W1993524345
MaRDI QID: Q1758034
FDO: Q1758034
Authors: Hideaki Iiduka, Yasushi Narushima
Publication date: 7 November 2012
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-011-0324-0
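For context, a minimal sketch of a generic nonlinear conjugate gradient iteration with Fletcher–Reeves beta and an Armijo backtracking line search. This is a textbook sketch only, not the method of the publication above (whose beta formula uses objective-function values and is not reproduced here); the function name and all parameters are illustrative.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=200, tol=1e-6):
    """Textbook nonlinear CG sketch (Fletcher-Reeves beta, Armijo line search).

    Not the method of Iiduka & Narushima (2012); purely illustrative.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves formula
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:
            d = -g_new  # restart if d is no longer a descent direction
        x, g = x_new, g_new
    return x
```

On a strongly convex quadratic f(x) = ½xᵀAx − bᵀx, the iterates converge to the solution of Ax = b.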
Recommendations
- A conjugate gradient method with sufficient descent and global convergence for unconstrained nonlinear optimization
- scientific article; zbMATH DE number 2196505
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Two new conjugate gradient methods for unconstrained optimization
- New conjugate gradient method for unconstrained optimization
Cites Work
- Algorithm 851
- CUTE
- Benchmarking optimization software with performance profiles
- Function minimization by conjugate gradients
- A conjugate direction algorithm without line searches
- A three-term conjugate gradient method with sufficient descent property for unconstrained optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Title not available
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Some descent three-term conjugate gradient methods and their global convergence
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Title not available
- Handbook of applied optimization
- A Use of Conjugate Gradient Direction for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- A survey of nonlinear conjugate gradient methods
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- An efficient hybrid conjugate gradient method for unconstrained optimization
- Memory gradient method for the minimization of functions
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Hybrid conjugate gradient method for a convex optimization problem over the fixed-point set of a nonexpansive mapping
- A new nonlinear conjugate gradient method for unconstrained optimization
- Global convergence of a memory gradient method for unconstrained optimization
- Study on a supermemory gradient method for the minimization of functions
- Flow search approach and new bounds for the \(m\)-step linear conjugate gradient algorithm
Cited In (5)
- Influence to new formulas gradient for removing impulse noise images
- A memory gradient method based on the nonmonotone technique
- A New Formula on the Conjugate Gradient Method for Removing Impulse Noise Images
- A new type of descent conjugate gradient method with exact line search
- A gradient-only line search method for the conjugate gradient method applied to constrained optimization problems with severe noise in the objective function