New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
Publication: 875393
DOI: 10.1016/J.CAM.2006.03.005
zbMath: 1116.65069
OpenAlex: W2083770700
Wikidata: Q59241592
Scholia: Q59241592
MaRDI QID: Q875393
Chun-Ming Tang, Zeng-xin Wei, Guoyin Li
Publication date: 13 April 2007
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2006.03.005
Keywords: algorithm; unconstrained optimization; global convergence; numerical examples; conjugate gradient method; conjugacy condition; quasi-Newton equation
MSC classification: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53)
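The keywords above refer to conjugate gradient methods built from a conjugacy condition derived from a quasi-Newton equation. As a hedged illustration of that family of techniques (not the exact method of this paper), the sketch below implements a nonlinear CG iteration with a Dai-Liao-type conjugacy condition, as in the cited work "New conjugacy conditions and related nonlinear conjugate gradient methods". The function name `cg_dai_liao`, the parameter `t`, and the backtracking line search are illustrative choices, not taken from the record.

```python
import numpy as np

def cg_dai_liao(f, grad, x0, t=0.1, tol=1e-6, max_iter=500):
    """Nonlinear conjugate gradient sketch enforcing a Dai-Liao-type
    conjugacy condition d_{k+1}^T y_k = -t * g_{k+1}^T s_k, which
    generalizes the classical condition d_{k+1}^T y_k = 0.
    Illustrative only; not the authors' exact method."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                               # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search with an Armijo sufficient-decrease test
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        s = alpha * d                    # step s_k = x_{k+1} - x_k
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                    # gradient difference y_k
        # Dai-Liao beta: yields d_{k+1}^T y_k = -t * g_{k+1}^T s_k
        beta = (g_new @ y - t * (g_new @ s)) / (d @ y)
        d = -g_new + beta * d
        if g_new @ d >= 0:               # restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = x^T A x / 2 - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = cg_dai_liao(lambda x: 0.5 * x @ A @ x - b @ x,
                     lambda x: A @ x - b,
                     np.array([5.0, -3.0]))
```

On this strongly convex quadratic the iterates converge to the solution of `A x = b`; the restart safeguard keeps every search direction a descent direction even when the curvature term `d @ y` is small.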
Cites Work
- Global convergence of a two-parameter family of conjugate gradient methods without line search
- A new method for nonsmooth convex optimization
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Minimization of functions having Lipschitz continuous first partial derivatives
- A three-parameter family of nonlinear conjugate gradient methods
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Convergence analysis of a proximal Newton method
- Proximité et dualité dans un espace hilbertien
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- New conjugacy conditions and related nonlinear conjugate gradient methods
Related Items (75)