Sufficient descent nonlinear conjugate gradient methods with conjugacy condition (Q849150)

Property / reviewed by: Karel Zimmermann
Property / describes a project that uses: CUTE
Property / describes a project that uses: L-BFGS
Property / describes a project that uses: CUTEr
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.1007/s11075-009-9318-8
Property / OpenAlex ID: W2081790497
Property / cites work: CUTE
Property / cites work: A spectral conjugate gradient method for unconstrained optimization
Property / cites work: New conjugacy conditions and related nonlinear conjugate gradient methods
Property / cites work: A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
Property / cites work: Benchmarking optimization software with performance profiles
Property / cites work: Global Convergence Properties of Conjugate Gradient Methods for Optimization
Property / cites work: Q5479892
Property / cites work: A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
Property / cites work: Methods of conjugate gradients for solving linear systems
Property / cites work: On the limited memory BFGS method for large scale optimization
Property / cites work: New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
Property / cites work: Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
Property / cites work: A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
Property / cites work: Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
Property / cites work: New quasi-Newton equation and related methods for unconstrained optimization
Property / cites work: Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations

Language: English
Label: Sufficient descent nonlinear conjugate gradient methods with conjugacy condition
Description: scientific article

    Statements

    Sufficient descent nonlinear conjugate gradient methods with conjugacy condition (English)
    24 February 2010
    The authors consider unconstrained optimization problems with a continuously differentiable objective function \(f: \mathbb{R}^n\to\mathbb{R}\) and propose a class of modified conjugate gradient methods for solving such problems. The methods in this class share the property that the direction \(d_k\) generated at iteration \(k\) and the corresponding gradient \(g_k\) of \(f\) satisfy the sufficient descent equality \(g_k^T d_k = -\|g_k\|^2\). Global convergence of the modified methods \(YT\) and \(YT+\), which belong to the proposed class, is proved under suitable conditions. Extensive numerical experiments on test problems from the CUTE library demonstrate the efficiency of the proposed methods.
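    The sufficient descent equality \(g_k^T d_k = -\|g_k\|^2\) can be enforced for an arbitrary conjugacy coefficient \(\beta_k\) by rescaling the gradient term of the search direction, \(d_k = -\theta_k g_k + \beta_k d_{k-1}\) with \(\theta_k = 1 + \beta_k g_k^T d_{k-1}/\|g_k\|^2\). The Python sketch below illustrates only this general construction; it is not the authors' \(YT\) or \(YT+\) method: the PRP-type \(\beta_k\), the Armijo backtracking line search, and the function names are illustrative assumptions, since the paper's exact formulas are not reproduced on this page.

```python
import numpy as np

def cg_sufficient_descent(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear CG sketch: every direction satisfies g_k^T d_k = -||g_k||^2."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking line search (stand-in for the paper's rule).
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= rho
        x, g_old = x + alpha * d, g
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        # PRP-type coefficient, an illustrative stand-in for the paper's
        # YT / YT+ formulas (not reproduced here).
        beta = g.dot(g - g_old) / g_old.dot(g_old)
        # Rescale the gradient term so that g_k^T d_k = -||g_k||^2 holds
        # exactly, whatever value beta takes.
        theta = 1.0 + beta * g.dot(d) / g.dot(g)
        d = -theta * g + beta * d
    return x

if __name__ == "__main__":
    # Ill-conditioned quadratic test problem.
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda x: 0.5 * x.dot(A @ x)
    grad = lambda x: A @ x
    print(cg_sufficient_descent(f, grad, np.ones(3)))  # approx. the zero vector
```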
    conjugate gradient method
    line search
    global convergence
    unconstrained optimization
