Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications (Q5931897)
Wikidata QID: Q126226942
scientific article; zbMATH DE number 1594637
Language: English
Statements
Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications (English)
6 May 2001
Two fundamental convergence theorems are given for nonlinear conjugate gradient methods under the descent condition alone. As a result, methods related to the Fletcher-Reeves algorithm still converge for parameters in a slightly wider range, in particular for the parameter at its upper bound. For methods related to the Polak-Ribière algorithm, it is shown that some negative values of the conjugate parameter do not prevent convergence. If the objective function is convex, some convergence results also hold for the Hestenes-Stiefel algorithm.
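For orientation, the methods named in the abstract all fit the generic nonlinear conjugate gradient scheme sketched below; the notation is the standard one and is assumed here rather than quoted from the paper under review.
\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0,
\]
where \(g_k = \nabla f(x_k)\) and the choice of the conjugate parameter \(\beta_k\) distinguishes the variants:
\[
\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad
\beta_k^{PR} = \frac{g_{k+1}^T (g_{k+1} - g_k)}{\|g_k\|^2}, \qquad
\beta_k^{HS} = \frac{g_{k+1}^T (g_{k+1} - g_k)}{d_k^T (g_{k+1} - g_k)}.
\]
The descent condition referred to above is \(g_k^T d_k < 0\) for every \(k\).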
unconstrained optimization
convergence
nonlinear conjugate gradient methods
Fletcher-Reeves algorithm
Polak-Ribière algorithm
Hestenes-Stiefel algorithm