On the sufficient descent condition of the Hager-Zhang conjugate gradient methods (Q483732)

From MaRDI portal





scientific article; zbMATH DE number 6381324

      Statements

      On the sufficient descent condition of the Hager-Zhang conjugate gradient methods (English)
      17 December 2014
      Conjugate gradient (CG) methods comprise a class of unconstrained optimization algorithms characterized by low memory requirements and strong global convergence properties. Although CG methods are not the fastest or most robust optimization algorithms available today, they remain very popular among engineers and mathematicians solving large-scale problems of the form \(\min_{x \in \mathbb{R}^n} f(x)\), where \(f : \mathbb{R}^n \to \mathbb{R}\) is a smooth nonlinear function whose gradient is available. Based on an eigenvalue analysis, the author establishes the sufficient descent condition for an extended class of the Hager-Zhang nonlinear conjugate gradient methods.
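      The method discussed above can be illustrated with a minimal sketch of a nonlinear CG iteration using the standard Hager-Zhang update \(\beta_k = \frac{(y_k - 2 d_k \|y_k\|^2 / d_k^\top y_k)^\top g_{k+1}}{d_k^\top y_k}\) with \(y_k = g_{k+1} - g_k\), together with a check of the sufficient descent condition \(d_k^\top g_k \le -\tfrac{7}{8}\|g_k\|^2\) that Hager and Zhang proved for their scheme. The Armijo backtracking line search and the test problem (a small strictly convex quadratic) are illustrative choices, not taken from the paper under review.

      ```python
      import numpy as np

      def hz_cg(f, grad, x0, tol=1e-8, max_iter=200):
          """Nonlinear CG with the Hager-Zhang beta and an Armijo backtracking
          line search (illustrative sketch, not the paper's exact algorithm)."""
          x = x0.astype(float).copy()
          g = grad(x)
          d = -g                      # initial steepest-descent direction
          descent_ok = True
          for _ in range(max_iter):
              if np.linalg.norm(g) < tol:
                  break
              # Sufficient descent condition: d^T g <= -(7/8) ||g||^2
              if d @ g > -(7.0 / 8.0) * (g @ g):
                  descent_ok = False
              # Armijo backtracking line search
              alpha = 1.0
              while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
                  alpha *= 0.5
              x_new = x + alpha * d
              g_new = grad(x_new)
              y = g_new - g
              dy = d @ y
              # Hager-Zhang conjugacy parameter
              beta = ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy
              d = -g_new + beta * d
              x, g = x_new, g_new
          return x, descent_ok

      # Toy strictly convex quadratic: f(x) = 0.5 x^T A x - b^T x,
      # minimizer solves A x = b, here x* = (0.2, 0.4).
      A = np.array([[3.0, 1.0], [1.0, 2.0]])
      b = np.array([1.0, 1.0])
      f = lambda x: 0.5 * x @ A @ x - b @ x
      grad = lambda x: A @ x - b

      x_star, descent_ok = hz_cg(f, grad, np.zeros(2))
      ```

      On this example the iterates converge to the exact minimizer and the sufficient descent condition holds at every step, consistent with the property the paper analyzes.
      
      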
      unconstrained optimization
      conjugate gradient algorithm
      eigenvalue
      sufficient descent condition
      global convergence
      large-scale problem
      Hager-Zhang nonlinear conjugate gradient method

      Identifiers