Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization (Q970585)
19 May 2010
In a previous paper [Numer. Algorithms 47, No. 2, 143--156 (2008; Zbl 1141.65041)], the author presented a hybrid conjugate gradient algorithm as a convex combination of the Hestenes-Stiefel [\textit{M. R. Hestenes} and \textit{E. Stiefel}, J. Res. Natl. Bur. Stand. 49, 409--435 (1952; Zbl 0048.09901)] and the Dai-Yuan [\textit{Y. H. Dai} and \textit{Y. Yuan}, SIAM J. Optim. 10, 177--182 (1999; Zbl 0957.65061)] algorithms, where the parameter in the convex combination is computed so that the resulting conjugate gradient direction is the best known direction to follow. The present paper proposes another variant of this hybrid conjugate gradient algorithm for unconstrained optimization, based on a modified secant condition; it performs better and is more robust than the variant using the classical secant condition. Convergence of the method is established. Numerical experiments on a set of 750 unconstrained optimization test problems show that the new algorithm outperforms the classical Hestenes-Stiefel and Dai-Yuan conjugate gradient algorithms, as well as some other hybrid variants of conjugate gradient algorithms.
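To illustrate the hybrid idea described above, the following is a minimal sketch of a conjugate gradient iteration whose \(\beta\) parameter is a convex combination of the Hestenes-Stiefel and Dai-Yuan formulas. It is not the author's algorithm: the weight `theta` is held fixed here for illustration, whereas in the paper the combination parameter is computed adaptively (via the modified secant condition), and the sketch uses an exact line search on a quadratic model rather than the paper's line-search and acceleration scheme.

```python
import numpy as np

def hybrid_cg(A, b, x0, theta=0.5, tol=1e-8, max_iter=200):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with a hybrid conjugate gradient direction.

    beta is a convex combination of the Hestenes-Stiefel and Dai-Yuan
    formulas; `theta` is a fixed illustrative weight, not the adaptively
    computed parameter of the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                     # gradient of the quadratic
    d = -g                            # initial (steepest-descent) direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Ad = A @ d
        alpha = -(g @ d) / (d @ Ad)   # exact line search for the quadratic
        x = x + alpha * d
        g_new = A @ x - b
        y = g_new - g                 # gradient difference y_k
        denom = d @ y
        if abs(denom) < 1e-16:
            d = -g_new                # safeguard: restart with steepest descent
        else:
            beta_hs = (g_new @ y) / denom        # Hestenes-Stiefel
            beta_dy = (g_new @ g_new) / denom    # Dai-Yuan
            beta = (1.0 - theta) * beta_hs + theta * beta_dy
            d = -g_new + beta * d
        g = g_new
    return x
```

On a strictly convex quadratic with exact line search, both formulas coincide with linear conjugate gradients, so any `theta` in [0, 1] recovers the classical behavior; the hybrid weighting only matters for general nonlinear objectives with inexact line searches.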
unconstrained optimization
hybrid conjugate gradient method
Newton direction
modified secant condition
numerical comparisons