Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications (Q5931897)

From MaRDI portal

MaRDI profile type: MaRDI publication profile

Cites work:
Function minimization by conjugate gradients
Q4103338
Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
Q5563083
Methods of conjugate gradients for solving linear systems
Q3313210
Efficient hybrid conjugate gradient techniques
Global convergence result for conjugate gradient methods
Global Convergence Properties of Conjugate Gradient Methods for Optimization
Convergence Conditions for Ascent Methods
Line search algorithms with guaranteed sufficient decrease
Q3141900

Identifiers:
Wikidata QID: Q126226942
Full work available at URL: https://doi.org/10.1007/bf02669682
OpenAlex ID: W2320871099

scientific article; zbMATH DE number 1594637
Language: English

    Statements

Title: Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications (English)
Publication date: 6 May 2001

Two fundamental convergence theorems are given for nonlinear conjugate gradient methods under the descent condition alone. As a result, methods related to the Fletcher-Reeves algorithm still converge for parameters in a slightly wider range, in particular for the parameter at its upper bound. For methods related to the Polak-Ribière algorithm, it is shown that some negative values of the conjugate parameter do not prevent convergence. If the objective function is convex, some convergence results hold for the Hestenes-Stiefel algorithm.
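
For context, the iteration and the classical parameter choices named above can be summarized as follows (a minimal sketch in standard notation, assuming $g_k = \nabla f(x_k)$; these are the textbook definitions, not formulas taken from the paper itself):

\[
  x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_{k+1} d_k,
\]
\[
  \beta_{k+1}^{\mathrm{FR}} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad
  \beta_{k+1}^{\mathrm{PR}} = \frac{g_{k+1}^{\top}(g_{k+1} - g_k)}{\|g_k\|^2}, \qquad
  \beta_{k+1}^{\mathrm{HS}} = \frac{g_{k+1}^{\top}(g_{k+1} - g_k)}{d_k^{\top}(g_{k+1} - g_k)}.
\]

The descent condition mentioned in the review is usually written as $g_k^{\top} d_k < 0$ for every $k$.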

Keywords:
unconstrained optimization
convergence
nonlinear conjugate gradient methods
Fletcher-Reeves algorithm
Polak-Ribière algorithm
Hestenes-Stiefel algorithm
