An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation (Q1039699)

From MaRDI portal
Property / DOI: 10.1016/j.amc.2009.08.016 / rank: Normal rank
Property / full work available at URL: https://doi.org/10.1016/j.amc.2009.08.016 / rank: Normal rank
Property / OpenAlex ID: W1966392521 / rank: Normal rank
Property / cites work: Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search / rank: Normal rank
Property / cites work: CUTE / rank: Normal rank
Property / cites work: A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property / rank: Normal rank
Property / cites work: A three-parameter family of nonlinear conjugate gradient methods / rank: Normal rank
Property / cites work: Function minimization by conjugate gradients / rank: Normal rank
Property / cites work: Global Convergence Properties of Conjugate Gradient Methods for Optimization / rank: Normal rank
Property / cites work: A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search / rank: Normal rank
Property / cites work: Q5479892 / rank: Normal rank
Property / cites work: Methods of conjugate gradients for solving linear systems / rank: Normal rank
Property / cites work: The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search / rank: Normal rank
Property / cites work: Efficient generalized conjugate gradient algorithms. I: Theory / rank: Normal rank
Property / cites work: Line search algorithms with guaranteed sufficient decrease / rank: Normal rank
Property / cites work: Q5563083 / rank: Normal rank
Property / cites work: The conjugate gradient method in extremal problems / rank: Normal rank
Property / cites work: Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems / rank: Normal rank
Property / cites work: The convergence properties of some new conjugate gradient methods / rank: Normal rank
Property / cites work: A note about WYL's conjugate gradient method and its applications / rank: Normal rank
Property / cites work: A modified PRP conjugate gradient method / rank: Normal rank
Property / cites work: Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems / rank: Normal rank
Property / cites work: A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence / rank: Normal rank
Property / cites work: Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search / rank: Normal rank
Property / cites work: Some descent three-term conjugate gradient methods and their global convergence / rank: Normal rank


scientific article
Language: English
Label: An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
Description: scientific article

    Statements

    An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation (English)
    23 November 2009
    The author considers the unconstrained minimization problem of minimizing a continuously differentiable function \(f\) defined on \(\mathbb{R}^n\). Conjugate gradient methods known from the literature are compared, and two slight modifications of \textit{Z. Wei}, \textit{S. Yao} and \textit{L. Liu}'s nonlinear conjugate gradient method [Appl. Math. Comput. 183, No.~2, 1341--1350 (2006; Zbl 1116.65073)] are proposed. The modified methods possess better convergence properties and converge globally if the strong Wolfe line search with a restriction on one of its parameters is used. The second of the two methods is proved to be globally convergent even if the standard Wolfe line search is used. Numerical results reported in the concluding part of the paper show that the methods are efficient for problems from the CUTE library [see \textit{I. Bongartz}, \textit{A. R. Conn}, \textit{N. Gould} and \textit{Ph. L. Toint}, CUTE: constrained and unconstrained testing environments, ACM Trans. Math. Softw. 21, No. 1, 123--160 (1995; Zbl 0886.65058)]. The efficiency of the proposed methods is compared with that of some other conjugate gradient methods.
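    The review refers to the Wei-Yao-Liu (WYL) conjugate gradient formula used with Wolfe-type line searches. The Python sketch below illustrates a generic WYL-type iteration only: it uses the original WYL choice of the parameter \(\beta_k\) from the 2006 paper rather than the improved variants proposed in the article under review, relies on SciPy's Wolfe line search, and its tolerances, line-search parameters and restart safeguard are illustrative assumptions, not the paper's algorithm.

        # Minimal sketch of a WYL-type nonlinear conjugate gradient method.
        # NOTE: an illustration, not the algorithm of the paper under review;
        # beta is the original Wei-Yao-Liu (2006) formula, and c1, c2 are
        # illustrative values (the paper restricts one of the strong Wolfe
        # parameters to obtain its convergence results).
        import numpy as np
        from scipy.optimize import line_search  # Wolfe-conditions line search

        def wyl_cg(f, grad, x0, tol=1e-6, max_iter=1000):
            x = np.asarray(x0, dtype=float)
            g = grad(x)
            d = -g                                   # start with steepest descent
            for _ in range(max_iter):
                if np.linalg.norm(g) <= tol:
                    break
                alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
                if alpha is None:                    # line search failed: restart
                    d, alpha = -g, 1e-4
                x_new = x + alpha * d
                g_new = grad(x_new)
                # Wei-Yao-Liu beta; nonnegative by the Cauchy-Schwarz inequality
                beta = (np.dot(g_new, g_new)
                        - np.linalg.norm(g_new) / np.linalg.norm(g) * np.dot(g_new, g)
                        ) / np.dot(g, g)
                d = -g_new + beta * d
                if np.dot(g_new, d) >= 0:            # safeguard: keep a descent direction
                    d = -g_new
                x, g = x_new, g_new
            return x

        # Usage example: minimize the Rosenbrock function from a standard start
        if __name__ == "__main__":
            from scipy.optimize import rosen, rosen_der
            print(wyl_cg(rosen, rosen_der, np.array([-1.2, 1.0])))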
    conjugate gradient method
    descent direction
    global convergence
    Wolfe line search
    numerical results
    efficiency

    Identifiers