A conjugate gradient method with global convergence for large-scale unconstrained optimization problems (Q1790099)
From MaRDI portal
Property / full work available at URL: https://doi.org/10.1155/2013/730454
Property / OpenAlex ID: W2087504876
Property / DBLP publication ID: journals/jam/YaoLW13
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | A conjugate gradient method with global convergence for large-scale unconstrained optimization problems | scientific article |
Statements
A conjugate gradient method with global convergence for large-scale unconstrained optimization problems (English)
10 October 2018
Summary: The conjugate gradient (CG) method plays a special role in solving large-scale nonlinear optimization problems because of its simplicity and very low memory requirements. This paper proposes a conjugate gradient method that is similar to the Dai-Liao conjugate gradient method [\textit{Y. H. Dai} and \textit{L. Z. Liao}, Appl. Math. Optim. 43, No. 1, 87--101 (2001; Zbl 0973.65050)] but has stronger convergence properties. The proposed method satisfies the sufficient descent condition and is globally convergent under the strong Wolfe-Powell (SWP) line search for general functions. Numerical results show that the proposed method is very efficient on the test problems.
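To illustrate the family of methods the summary describes, here is a minimal sketch of a Dai-Liao-type CG iteration with a strong Wolfe line search. It uses the classical Dai-Liao update \(\beta_k = g_{k+1}^\top (y_k - t\,s_k) / (d_k^\top y_k)\), not the paper's modified formula; the parameter `t = 0.1`, the tolerance, and the steepest-descent restart are illustrative assumptions, and `scipy.optimize.line_search` (which enforces the strong Wolfe conditions) stands in for the SWP line search.

```python
import numpy as np
from scipy.optimize import line_search

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=500):
    """Dai-Liao-type conjugate gradient sketch (classical beta, not the
    paper's modified formula). t, tol, and the restart rule are assumptions."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Strong Wolfe line search (c2 = 0.1 is a common choice for CG).
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                    # line search failed: restart
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s = x_new - x                        # step vector s_k
        y = g_new - g                        # gradient difference y_k
        denom = d @ y
        # Dai-Liao beta; guard against a vanishing denominator.
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d                # new search direction
        x, g = x_new, g_new
    return x

# Usage: minimize a convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = dai_liao_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                     lambda x: A @ x - b,
                     np.zeros(2))
```

On this quadratic the iterate converges to the exact minimizer `np.linalg.solve(A, b)`; for the general (nonconvex) functions the paper targets, only convergence to a stationary point under the SWP conditions is claimed.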