Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications (Q5931897)
From MaRDI portal
Wikidata QID: Q126226942
Full work available at URL: https://doi.org/10.1007/bf02669682
OpenAlex ID: W2320871099
Latest revision as of 09:59, 30 July 2024
scientific article; zbMATH DE number 1594637
Statements
Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications (English)
6 May 2001
Two fundamental convergence theorems are given for nonlinear conjugate gradient methods under only the descent condition. As a result, methods related to the Fletcher-Reeves algorithm still converge for parameters in a slightly wider range, in particular for a parameter at its upper bound. For methods related to the Polak-Ribière algorithm, it is shown that some negative values of the conjugate parameter do not prevent convergence. If the objective function is convex, some convergence results also hold for the Hestenes-Stiefel algorithm.
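The methods discussed in the review share one iteration scheme, differing only in the choice of the conjugate parameter (beta). The following is a minimal illustrative sketch of that scheme with the Fletcher-Reeves and Polak-Ribière beta formulas and a backtracking line search; the function name `nonlinear_cg`, the restart safeguard, and the line-search constants are illustrative assumptions, not the paper's own algorithm or conditions.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule="FR", tol=1e-8, max_iter=500):
    """Sketch of a nonlinear conjugate gradient method.

    beta_rule: "FR" (Fletcher-Reeves) or "PR" (Polak-Ribiere, clipped
    at zero). Illustrative only; the reviewed paper analyzes convergence
    of such schemes under a descent condition, not this exact code.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        slope = g @ d
        if slope >= 0:  # safeguard: restart if d is not a descent direction
            d, slope = -g, -(g @ g)
        # Backtracking (Armijo) line search along d.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        if beta_rule == "FR":  # Fletcher-Reeves
            beta = (g_new @ g_new) / (g @ g)
        else:                  # Polak-Ribiere, nonnegative variant
            beta = max((g_new @ (g_new - g)) / (g @ g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = x^T A x / 2 - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = nonlinear_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                      lambda x: A @ x - b, np.zeros(2))
```

On a convex quadratic both beta rules coincide with linear conjugate gradients under exact line searches; the backtracking search here only approximates that, which is why a restart safeguard is included.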
Keywords:
unconstrained optimization
convergence
nonlinear conjugate gradient methods
Fletcher-Reeves algorithm
Polak-Ribière algorithm
Hestenes-Stiefel algorithm