Two descent hybrid conjugate gradient methods for optimization (Q2483351): Difference between revisions

From MaRDI portal
Property / describes a project that uses: CUTE

Revision as of 13:15, 28 February 2024

scientific article (English)

    Statements

    Two descent hybrid conjugate gradient methods for optimization (English)
    28 April 2008
    The aim of the paper is to study the convergence and computational properties of two new descent hybrid conjugate gradient methods for nonlinear optimization problems, namely the global minimization of a continuously differentiable function of \(n\) variables over \(\mathbb{R}^n\). The methods require no restarts and produce a sufficient descent search direction at each iteration. No convexity assumptions are needed: the results hold for functions with bounded level sets and bounded, Lipschitz continuous gradients. The numerical results presented at the end of the paper demonstrate the efficiency of the proposed methods.
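    To illustrate the general scheme that descent hybrid conjugate gradient methods follow, here is a minimal sketch of a nonlinear CG iteration using the well-known hybrid parameter \(\beta = \max(0, \min(\beta^{HS}, \beta^{DY}))\) (Hestenes-Stiefel / Dai-Yuan family) together with an Armijo backtracking line search and a descent safeguard. This is a generic illustration of the method class under these assumptions, not the exact updates proposed in the reviewed paper.

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear conjugate gradient with a hybrid beta and a descent safeguard.

    Illustrative sketch only: beta = max(0, min(beta_HS, beta_DY)) is a
    standard hybrid choice, not necessarily the update from the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g.dot(d)
        while f(x + alpha * d) > fx + c * alpha * slope and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g               # change in the gradient
        denom = d.dot(y)
        if abs(denom) < 1e-12:
            beta = 0.0              # degenerate denominator: restart
        else:
            beta_hs = g_new.dot(y) / denom        # Hestenes-Stiefel
            beta_dy = g_new.dot(g_new) / denom    # Dai-Yuan
            beta = max(0.0, min(beta_hs, beta_dy))
        d = -g_new + beta * d
        if g_new.dot(d) >= 0.0:
            d = -g_new              # safeguard: keep d a descent direction
        x, g = x_new, g_new
    return x
```

    As a usage example, minimizing the convex quadratic \(f(x) = \tfrac12 x^{\top}Ax - b^{\top}x\) with \(A = \begin{pmatrix}3&1\\1&2\end{pmatrix}\) and \(b = (1,1)^{\top}\) converges to the unique minimizer \(x^{*} = (0.2, 0.4)^{\top}\).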
    conjugate gradient method
    descent direction
    global convergence
    global optimization
    nonlinear optimization
    numerical results

    Identifiers