A new algorithm of nonlinear conjugate gradient method with strong convergence (Q959477)
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | A new algorithm of nonlinear conjugate gradient method with strong convergence | scientific article |
Statements
A new algorithm of nonlinear conjugate gradient method with strong convergence (English)
11 December 2008
This paper considers the unconstrained minimization problem \[ \min_{x\in\mathbb R^n} f(x), \] where \(\mathbb R^n\) is the \(n\)-dimensional Euclidean space and \(f:\mathbb R^n\to\mathbb R\) is a continuously differentiable function; when \(n\) is very large, this is a large-scale problem. Motivated by the conjugate gradient algorithms of \textit{Y.-X. Yuan} and \textit{J. Stoer} [Z. Angew. Math. Mech. 75, No.~1, 69--77 (1995; Zbl 0823.65061)], the paper applies the trust region technique to conjugate gradient methods. The new algorithm generates an adequate trust region radius automatically at each iteration, is globally convergent, and attains a linear convergence rate under mild conditions. The theoretical analysis and numerical results indicate that the proposed algorithm is promising and can solve ill-conditioned minimization problems; in the reported experiments it is superior to other simple conjugate gradient methods in many situations.
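To illustrate the general idea of combining a nonlinear conjugate gradient iteration with a trust-region-style control of the step length, the following is a minimal Python sketch. It is not the algorithm proposed in the paper: the PR+ update, the Armijo backtracking, the radius-update rule, and the function names `nonlinear_cg`, `f`, `grad` are all illustrative assumptions.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=2000, tol=1e-8, delta0=1.0):
    """Generic nonlinear CG sketch with a trust-region-style cap on the
    step length.  Illustrative only; not the paper's algorithm."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # start with the steepest-descent direction
    delta = delta0               # current step-length cap ("trust radius")
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Initial step bounded by the radius, then Armijo backtracking.
        alpha = min(1.0, delta / max(np.linalg.norm(d), 1e-16))
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * g.dot(d) and alpha > 1e-16:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere+ coefficient; restart to steepest descent if negative.
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        # Crude radius update: expand after a nearly full step, else shrink.
        delta = 2.0 * delta if alpha >= 0.9 else 0.5 * delta
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Small test problem: the Rosenbrock function in two variables.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    print(nonlinear_cg(f, grad, np.array([-1.2, 1.0])))
```

The step-length cap plays the role a trust region radius would play in the paper's method, while the restart in the PR+ formula keeps the search direction a descent direction; both choices here are simple stand-ins for the rules analyzed in the article.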
unconstrained optimization
global convergence
linear convergence