Convergence properties of the Fletcher-Reeves method
From MaRDI portal
Publication:4874974
DOI: 10.1093/imanum/16.2.155
zbMath: 0851.65049
OpenAlex: W1995409333
MaRDI QID: Q4874974
Publication date: 26 November 1996
Published in: IMA Journal of Numerical Analysis
Full work available at URL: https://doi.org/10.1093/imanum/16.2.155
Keywords: unconstrained optimization; global convergence; nonlinear programming; inexact line search; conjugate gradients; Fletcher-Reeves method
MSC classification: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30)
Related Items (53)
Some modified conjugate gradient methods for unconstrained optimization
A family of hybrid conjugate gradient methods for unconstrained optimization
New hybrid conjugate gradient method as a convex combination of LS and FR methods
Convergence properties of a correlation Polak-Ribière conjugate gradient method
A new family of globally convergent conjugate gradient methods
Global convergence properties of the two new dependent Fletcher-Reeves conjugate gradient methods
Determination of the space-dependent source term in a fourth-order parabolic problem
On the convergence of \(s\)-dependent GFR conjugate gradient method for unconstrained optimization
Convergence properties of the dependent PRP conjugate gradient methods
New conjugate gradient-like methods for unconstrained optimization
The convergence properties of some new conjugate gradient methods
A gradient-related algorithm with inexact line searches
Globally convergence of nonlinear conjugate gradient method for unconstrained optimization
Convergence of Liu-Storey conjugate gradient method
A descent nonlinear conjugate gradient method for large-scale unconstrained optimization
A new variant of the memory gradient method for unconstrained optimization
Two diagonal conjugate gradient like methods for unconstrained optimization
Recovery of the time-dependent zero-order coefficient in a fourth-order parabolic problem
A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization
On the convergence rate of Fletcher-Reeves nonlinear conjugate gradient methods satisfying strong Wolfe conditions: Application to parameter identification in problems governed by general dynamics
Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization
Three-term conjugate gradient method for the convex optimization problem over the fixed point set of a nonexpansive mapping
Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
The global convergence of a new mixed conjugate gradient method for unconstrained optimization
GLOBAL CONVERGENCE OF A SPECIAL CASE OF THE DAI–YUAN FAMILY WITHOUT LINE SEARCH
Global convergence of some modified PRP nonlinear conjugate gradient methods
A modified conjugacy condition and related nonlinear conjugate gradient method
A hybrid of DL and WYL nonlinear conjugate gradient methods
A three-parameter family of nonlinear conjugate gradient methods
Modification of the Wolfe line search rules to satisfy the descent condition in the Polak-Ribière-Polyak conjugate gradient method
Further insight into the convergence of the Fletcher-Reeves method
Memory gradient method with Goldstein line search
On Conjugate Gradient Algorithms as Objects of Scientific Study
Some global convergence properties of the Wei-Yao-Liu conjugate gradient method with inexact line search
Two new conjugate gradient methods based on modified secant equations
Global convergence of the DY conjugate gradient method with Armijo line search for unconstrained optimization problems
Some descent three-term conjugate gradient methods and their global convergence
Global convergence property of \(s\)-dependent GFR conjugate gradient method
Some sufficient descent conjugate gradient methods and their global convergence
New hybrid conjugate gradient method as a convex combination of LS and CD methods
A new family of conjugate gradient methods
Hybrid conjugate gradient method for a convex optimization problem over the fixed-point set of a nonexpansive mapping
Simultaneous reconstruction of the perfusion coefficient and initial temperature from time-average integral temperature measurements
Convergence of conjugate gradient methods with constant stepsizes
Simultaneous identification and reconstruction of the space-dependent reaction coefficient and source term
Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
Convergence properties of the Beale-Powell restart algorithm
Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
A new two-parameter family of nonlinear conjugate gradient methods
Comments on "New hybrid conjugate gradient method as a convex combination of FR and PRP methods"
A new spectral conjugate gradient method for large-scale unconstrained optimization
Conjugate gradient methods with Armijo-type line searches