Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization (Q2390003)
Language | Label | Description | Also known as |
---|---|---|---|
English | Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization | scientific article | |
Statements
Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization (English)
20 July 2009
A new algorithm for accelerating the conjugate gradient method is developed in which the parameter \(\beta_k\) is computed using a finite difference approximation of the Hessian/vector product. The search direction is likewise computed using a forward difference approximation of the Hessian/vector product. In contrast to Newton and quasi-Newton methods, for the conjugate gradient method the step lengths may differ from 1 depending on how the problem is scaled. The authors suggest a modification of the step length that reduces the number of function evaluations compared with currently available conjugate gradient algorithms. It is proved that the method is globally convergent and that its convergence rate is linear for uniformly convex functions, while the reduction of function values is significantly improved. The performance of the suggested method is demonstrated on 750 large scale unconstrained test problems by comparing it with conjugate gradient algorithms such as CONMIN and SCALCG and with truncated Newton methods.
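The central device is easy to sketch. Below is a minimal Python illustration of a conjugate gradient iteration whose \(\beta_k\) is built from a forward-difference Hessian/vector product; it assumes a Daniel-type formula for \(\beta_k\) and an exact line search on a small quadratic, not the paper's precise algorithm (the names `hess_vec` and `beta_daniel`, the value of `eps`, and the test problem are all hypothetical choices made for this sketch):

```python
import numpy as np

def hess_vec(grad, x, v, eps=1e-8):
    # Forward-difference approximation of the Hessian/vector product:
    #   H(x) v ~= (grad(x + eps*v) - grad(x)) / eps
    # One extra gradient evaluation; no explicit Hessian is formed.
    return (grad(x + eps * v) - grad(x)) / eps

def beta_daniel(grad, x_new, g_new, d, eps=1e-8):
    # Daniel-type conjugacy parameter using the approximated product
    # at the new iterate; an illustrative choice, not necessarily the
    # formula used in the paper.
    Hd = hess_vec(grad, x_new, d, eps)
    return (g_new @ Hd) / (d @ Hd)

# Demo on a convex quadratic f(x) = 0.5 x^T A x - b^T x, whose exact
# line search step along d is alpha = -g^T d / (d^T A d).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b

x = np.zeros(2)
g = grad(x)
d = -g
for _ in range(50):
    Hd = hess_vec(grad, x, d)
    alpha = -(g @ d) / (d @ Hd)   # exact line search on the quadratic
    x = x + alpha * d
    g = grad(x)
    if np.linalg.norm(g) < 1e-8:  # FD noise limits attainable accuracy
        break
    d = -g + beta_daniel(grad, x, g, d) * d

print("minimizer:", x, "gradient norm:", np.linalg.norm(g))
```

The differencing interval `eps` trades truncation error against rounding error; values near the square root of machine precision (about 1e-8 in double precision) are customary. Each product costs a single additional gradient evaluation, which is what makes the approach attractive for large scale problems. Note that this sketch omits the modified step length, which is the acceleration the paper actually contributes.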
unconstrained optimization
conjugate gradient method
Newton directions
large scale test examples
numerical comparisons
convergence acceleration
algorithm
forward difference approximation of Hessian/vector product