Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
Publication: 1876593
DOI: 10.1023/B:COAP.0000026885.81997.88
zbMath: 1056.90130
OpenAlex: W2029493470
MaRDI QID: Q1876593
Publication date: 20 August 2004
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1023/b:coap.0000026885.81997.88
Keywords: unconstrained optimization; global convergence; conjugate gradient method; line search; modified secant condition
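The methods this record and its related items concern build conjugate gradient directions from secant conditions that use objective-function values as well as gradients. The following is a minimal illustrative sketch, not the paper's exact algorithm: a Dai-Liao-type nonlinear conjugate gradient iteration in which the usual gradient difference y_k is replaced by a modified secant vector z_k that incorporates function values. The parameter rho, the Armijo backtracking line search, the non-negativity safeguard on theta, and the stopping rule are assumptions chosen for brevity.

```python
import numpy as np

def modified_secant_cg(f, grad, x0, rho=1.0, tol=1e-6, max_iter=500):
    """Sketch of a Dai-Liao-type nonlinear CG method with a modified secant vector."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking Armijo line search (a simplification; convergence theory
        # for these methods is usually stated under Wolfe-type conditions).
        fx, alpha = f(x), 1.0
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Modified secant vector z = y + rho * theta / (s^T s) * s, where theta is
        # built from function values; the constants and the safeguard max(theta, 0)
        # are assumptions, one common choice in this line of work.
        theta = 6.0 * (fx - f(x_new)) + 3.0 * (g + g_new).dot(s)
        z = y + rho * max(theta, 0.0) / s.dot(s) * s
        denom = d.dot(z)
        beta = g_new.dot(z) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d                 # conjugate gradient direction update
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from a standard starting point.
if __name__ == "__main__":
    rosen = lambda x: 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2
    rosen_grad = lambda x: np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])
    print(modified_secant_cg(rosen, rosen_grad, [-1.2, 1.0]))
```

In the literature, global convergence of such methods is typically proved under Wolfe-type line search conditions rather than the simple backtracking used in this sketch.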
Related Items (67)
On Hager and Zhang's conjugate gradient method with guaranteed descent ⋮ Sufficient descent conjugate gradient methods for large-scale optimization problems ⋮ Spectral method and its application to the conjugate gradient method ⋮ An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition ⋮ A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update ⋮ A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method ⋮ New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters ⋮ A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family ⋮ Sufficient descent nonlinear conjugate gradient methods with conjugacy condition ⋮ A modified Hager-Zhang conjugate gradient method with optimal choices for solving monotone nonlinear equations ⋮ A modified conjugate gradient method based on a modified secant equation ⋮ A class of one parameter conjugate gradient methods ⋮ An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix ⋮ An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems ⋮ Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems ⋮ A modified conjugate gradient parameter via hybridization approach for solving large-scale systems of nonlinear equations ⋮ A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems ⋮ A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations ⋮ A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence ⋮ An adaptive modified three-term conjugate gradient method with global convergence ⋮ A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model ⋮ On a scaled symmetric Dai-Liao-type scheme for constrained system of nonlinear equations with applications ⋮ A new conjugate gradient algorithm for training neural networks based on a modified secant equation ⋮ Two-step conjugate gradient method for unconstrained optimization ⋮ Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition ⋮ Descent Perry conjugate gradient methods for systems of monotone nonlinear equations ⋮ Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization ⋮ Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization ⋮ A modified conjugate gradient method for general convex functions ⋮ A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization ⋮ A new conjugate gradient method based on quasi-Newton equation for unconstrained optimization ⋮ A modified conjugacy condition and related nonlinear conjugate gradient method ⋮ A survey of gradient methods for solving nonlinear optimization ⋮ An improved nonlinear conjugate gradient method with an optimal property ⋮ A novel value for the parameter in the Dai-Liao-type conjugate gradient method ⋮ A modified Hestenes–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method ⋮ Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations ⋮ Multi-step nonlinear conjugate gradient methods for unconstrained minimization ⋮ A limited memory descent Perry conjugate gradient method ⋮ Conjugate gradient methods using value of objective function for unconstrained optimization ⋮ Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems ⋮ Two new conjugate gradient methods based on modified secant equations ⋮ Some nonlinear conjugate gradient methods based on spectral scaling secant equations ⋮ A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization ⋮ A descent family of Dai–Liao conjugate gradient methods ⋮ A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations ⋮ The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices ⋮ A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition ⋮ A family of accelerated hybrid conjugate gradient method for unconstrained optimization and image restoration ⋮ Convergence of the descent Dai–Yuan conjugate gradient method for unconstrained optimization ⋮ A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization ⋮ On the extension of Dai-Liao conjugate gradient method for vector optimization ⋮ A conjugate gradient method with sufficient descent property ⋮ Modified Hager–Zhang conjugate gradient methods via singular value analysis for solving monotone nonlinear equations with convex constraint ⋮ Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations ⋮ A modified BFGS algorithm based on a hybrid secant equation ⋮ A new accelerated conjugate gradient method for large-scale unconstrained optimization ⋮ Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization ⋮ A truncated descent HS conjugate gradient method and its global convergence ⋮ Scaled nonlinear conjugate gradient methods for nonlinear least squares problems ⋮ A modified bat algorithm with conjugate gradient method for global optimization ⋮ A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function ⋮ A nonlinear conjugate gradient method based on the MBFGS secant condition ⋮ A new hybrid algorithm for convex nonlinear unconstrained optimization ⋮ A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method ⋮ A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter ⋮ A modified Perry conjugate gradient method and its global convergence