Two new conjugate gradient methods based on modified secant equations
From MaRDI portal
DOI: 10.1016/j.cam.2010.01.052
zbMath: 1202.65071
Wikidata: Q57952896 (Scholia: Q57952896)
MaRDI QID: Q972741
Nezam Mahdavi-Amiri, Saman Babaie-Kafaki, Reza Ghanbari
Publication date: 21 May 2010
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2010.01.052
Keywords: unconstrained optimization; global convergence; conjugate gradient method; conjugacy condition; modified secant equation
MSC classification: 65K05 (Numerical mathematical programming methods)
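For context on the keyword "modified secant equation": the term refers to a well-known strengthening of the standard quasi-Newton secant condition that also exploits function values. A brief sketch follows, using the widely cited Zhang-Deng-Chen form as an illustration; this is not necessarily the exact variant developed in the paper itself.

```latex
% Standard secant equation: the updated Hessian approximation B_{k+1}
% maps the step s_k onto the gradient difference y_k.
\[
  B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \quad y_k = g_{k+1} - g_k .
\]
% Modified secant equation (Zhang-Deng-Chen form, shown for illustration):
% y_k is replaced by a vector z_k that also carries function-value
% information, so B_{k+1} interpolates f more accurately.
\[
  B_{k+1} s_k = z_k, \qquad
  z_k = y_k + \frac{\theta_k}{s_k^{T} u_k}\, u_k, \qquad
  \theta_k = 6\,(f_k - f_{k+1}) + 3\,(g_k + g_{k+1})^{T} s_k,
\]
% where u_k is any vector with s_k^T u_k \neq 0 (a common choice is u_k = s_k).
```

In conjugate gradient methods of Dai-Liao type, such a modified condition replaces the standard one in the conjugacy condition, which is how the papers listed below connect to this work.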
Related Items
- Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
- A descent family of Dai-Liao conjugate gradient methods
- Two modified scaled nonlinear conjugate gradient methods
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- An improved nonlinear conjugate gradient method with an optimal property
- Two modified three-term conjugate gradient methods with sufficient descent property
- A modified BFGS algorithm based on a hybrid secant equation
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- A modified Perry conjugate gradient method and its global convergence
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A hybrid conjugate gradient method based on a quadratic relaxation of the Dai-Yuan hybrid conjugate gradient parameter
- A new class of efficient and globally convergent conjugate gradient methods in the Dai-Liao family
Cites Work
- Convergence of the Polak-Ribière-Polyak conjugate gradient method
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Multi-step quasi-Newton methods for optimization
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Modified two-point stepsize gradient methods for unconstrained optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- Using function-values in multi-step quasi-Newton methods
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- New quasi-Newton methods for unconstrained optimization problems
- Optimization theory and methods. Nonlinear programming
- Unified approach to quadratically convergent algorithms for function minimization
- A Modified BFGS Algorithm for Unconstrained Optimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Updating conjugate directions by the BFGS formula
- Technical Note: A Modified Conjugate Gradient Algorithm
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- Conjugate Gradient Methods with Inexact Searches
- Numerical Optimization
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Convergence properties of the Fletcher-Reeves method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- A spectral conjugate gradient method for unconstrained optimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- An efficient hybrid conjugate gradient method for unconstrained optimization