On the convergence of \(s\)-dependent GFR conjugate gradient method for unconstrained optimization (Q1652787)

From MaRDI portal

Full work available at URL: https://doi.org/10.1007/s11075-017-0397-7
OpenAlex ID: W2748382761

Language: English
Label: On the convergence of \(s\)-dependent GFR conjugate gradient method for unconstrained optimization
Description: scientific article

    Statements

    On the convergence of \(s\)-dependent GFR conjugate gradient method for unconstrained optimization (English)
    16 July 2018
    The conjugate gradient method is an effective method for solving unconstrained optimization problems. The Fletcher-Reeves (FR) conjugate gradient method with inexact line searches was studied by \textit{Y. H. Dai} and \textit{Y. Yuan} [IMA J. Numer. Anal. 16, No. 2, 155--164 (1996; Zbl 0851.65049)], who obtained a generalized FR method, called the GFR method. Building on the GFR method, the authors present an \(s\)-dependent GFR method for unconstrained optimization. They also derive two different kinds of upper-bound estimates for a parameter that satisfies the FR formula. Under several step-size rules, the global convergence of the \(s\)-dependent GFR conjugate gradient method is proved.
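    For orientation, the classical FR scheme that the GFR method and its \(s\)-dependent variant generalize can be recalled as follows (the authors' specific generalized choice of \(\beta_k\) is not reproduced here): starting from \(x_1\), the iterates are \(x_{k+1} = x_k + \alpha_k d_k\) with search directions
    \[
    d_k = \begin{cases} -g_k, & k = 1, \\ -g_k + \beta_k^{\mathrm{FR}} d_{k-1}, & k \ge 2, \end{cases}
    \qquad
    \beta_k^{\mathrm{FR}} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2},
    \]
    where \(g_k = \nabla f(x_k)\) and the step-length \(\alpha_k\) is chosen by a line search rule.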
    Keywords: step-length; line search; global convergence; conjugate gradient

    Identifiers