Two new conjugate gradient methods for unconstrained optimization (Q2179153)
From MaRDI portal
scientific article; zbMATH DE number 7199206
Statements
Two new conjugate gradient methods for unconstrained optimization (English)
12 May 2020
Summary: The conjugate gradient method is very effective for solving large-scale unconstrained optimization problems. In this paper, building on the conjugate parameter of the conjugate descent (CD) method and the second inequality of the strong Wolfe line search, two new conjugate parameters are devised. Using the strong Wolfe line search to obtain the step lengths, two modified conjugate gradient methods are proposed for general unconstrained optimization. Under standard assumptions, both methods are proved to satisfy the sufficient descent condition and to be globally convergent. Finally, preliminary numerical results are reported to show that the proposed methods are promising.
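As an illustration of the framework the abstract describes, the following is a minimal sketch of the classical conjugate descent (CD) method that the paper's new parameters build on, paired with a simple backtracking line search that checks the strong Wolfe conditions. The two new conjugate parameters themselves are not given in the abstract and are not reproduced here; the test function, the fallback to a plain sufficient-decrease step, and the steepest-descent restart safeguard are all illustrative assumptions, not the authors' method.

```python
# Sketch of the CD conjugate gradient method with a strong-Wolfe-style
# backtracking line search. Illustrative only: the paper's two new
# conjugate parameters are not stated in the abstract, so only the
# classical CD parameter beta_k = ||g_{k+1}||^2 / (-g_k^T d_k) is shown.
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def f(x):  # illustrative strongly convex quadratic test problem
    return 2 * x[0]**2 + x[1]**2 + x[0] * x[1] - x[0] - x[1]

def grad(x):
    return [4 * x[0] + x[1] - 1, 2 * x[1] + x[0] - 1]

def strong_wolfe_step(x, d, c1=1e-4, c2=0.4, alpha=1.0, shrink=0.5):
    """Backtrack until both strong Wolfe conditions hold; fall back to
    the first sufficient-decrease step if the curvature test never passes
    (a simplification, not a guaranteed bracketing routine)."""
    fx = f(x)
    gd = dot(grad(x), d)              # directional derivative, < 0 for descent
    armijo_alpha = None
    for _ in range(50):
        xn = [xi + alpha * di for xi, di in zip(x, d)]
        if f(xn) <= fx + c1 * alpha * gd:            # sufficient decrease
            if abs(dot(grad(xn), d)) <= -c2 * gd:    # strong Wolfe curvature
                return alpha
            if armijo_alpha is None:
                armijo_alpha = alpha
        alpha *= shrink
    return armijo_alpha if armijo_alpha is not None else alpha

def cd_method(x, tol=1e-8, max_iter=200):
    g = grad(x)
    d = [-gi for gi in g]             # first direction: steepest descent
    for _ in range(max_iter):
        if math.sqrt(dot(g, g)) < tol:
            break
        if dot(g, d) >= 0:            # safeguard: restart if not a descent direction
            d = [-gi for gi in g]
        a = strong_wolfe_step(x, d)
        x = [xi + a * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = dot(g_new, g_new) / (-dot(g, d))      # CD conjugate parameter
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

x_star = cd_method([0.0, 0.0])        # minimizer of f is (1/7, 3/7)
```

On this test problem the iteration converges to the unique minimizer (1/7, 3/7); the restart safeguard is one common way to enforce the sufficient descent property that the paper instead obtains directly from its new conjugate parameters.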