Two descent hybrid conjugate gradient methods for optimization (Q2483351)

From MaRDI portal
Property / DOI: 10.1016/j.cam.2007.04.028
Property / describes a project that uses: CUTEr
Property / describes a project that uses: SCALCG
Property / describes a project that uses: CUTE
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.1016/j.cam.2007.04.028
Property / OpenAlex ID: W1964845142
Property / cites work: Descent Property and Global Convergence of the Fletcher–Reeves Method with Inexact Line Search
Property / cites work: Scaled conjugate gradient algorithms for unconstrained optimization
Property / cites work: A spectral conjugate gradient method for unconstrained optimization
Property / cites work: CUTE
Property / cites work: A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
Property / cites work: An efficient hybrid conjugate gradient method for unconstrained optimization
Property / cites work: Benchmarking optimization software with performance profiles.
Property / cites work: Q4226179
Property / cites work: Function minimization by conjugate gradients
Property / cites work: Global Convergence Properties of Conjugate Gradient Methods for Optimization
Property / cites work: A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
Property / cites work: Methods of conjugate gradients for solving linear systems
Property / cites work: Efficient generalized conjugate gradient algorithms. I: Theory
Property / cites work: Q5563083
Property / cites work: The conjugate gradient method in extremal problems
Property / cites work: Q3313210
Property / cites work: Efficient hybrid conjugate gradient techniques
Property / cites work: Convergence Conditions for Ascent Methods
Property / cites work: A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
Property / cites work: Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
Property / cites work: Q4103338
scientific article
Language: English
Two descent hybrid conjugate gradient methods for optimization
    Statements

    Two descent hybrid conjugate gradient methods for optimization (English)
    28 April 2008
    The paper studies the convergence and computational properties of two new descent hybrid conjugate gradient methods for nonlinear optimization problems, namely the global minimization of a continuously differentiable function of \(n\) variables over \(\mathbb{R}^n\). The methods require no restarts and produce a sufficient descent search direction at each iteration. No convexity assumptions are needed: the convergence results hold for functions with bounded level sets and bounded, Lipschitz continuous gradients. The numerical experiments reported at the end of the paper demonstrate the efficiency of the proposed methods.
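To illustrate the general scheme the review describes (a hybrid conjugate gradient iteration that produces a descent direction at every step, without restarts), here is a minimal sketch. The hybridization rule \(\beta_k = \max(0, \min(\beta_k^{HS}, \beta_k^{DY}))\) and an Armijo backtracking line search are illustrative assumptions, not the exact formulas proposed in the paper; all function names are hypothetical.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def axpy(a, x, y):
    """Return a*x + y for plain-list vectors."""
    return [a * xi + yi for xi, yi in zip(x, y)]

def hybrid_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Generic hybrid conjugate gradient method (illustrative sketch).

    Uses beta = max(0, min(beta_HS, beta_DY)), a standard hybridization of
    the Hestenes-Stiefel and Dai-Yuan parameters, with Armijo backtracking.
    This is NOT necessarily the rule from the reviewed paper.
    """
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]              # first direction: steepest descent
    for _ in range(max_iter):
        if math.sqrt(dot(g, g)) < tol:
            break
        # Armijo backtracking line search along d
        t, fx, gTd = 1.0, f(x), dot(g, d)
        while f(axpy(t, d, x)) > fx + 1e-4 * t * gTd and t > 1e-12:
            t *= 0.5
        x_new = axpy(t, d, x)
        g_new = grad(x_new)
        y = [gn - gi for gn, gi in zip(g_new, g)]   # gradient change y_k
        denom = dot(d, y)              # shared denominator d_k^T y_k
        if abs(denom) < 1e-16:
            beta = 0.0                 # safeguard: fall back to steepest descent
        else:
            beta_hs = dot(g_new, y) / denom
            beta_dy = dot(g_new, g_new) / denom
            beta = max(0.0, min(beta_hs, beta_dy))
        d = axpy(beta, d, [-gn for gn in g_new])    # d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = sum_i (x_i - i)^2, minimizer (0, 1, 2)
f = lambda x: sum((xi - i) ** 2 for i, xi in enumerate(x))
grad = lambda x: [2.0 * (xi - i) for i, xi in enumerate(x)]
xmin = hybrid_cg(f, grad, [0.0, 0.0, 0.0])
```

Truncating the hybrid parameter at zero keeps the update close to a restart when consecutive gradients are far from conjugate, which is one common way such methods avoid explicit restarts.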
    Keywords: conjugate gradient method; descent direction; global convergence; global optimization; nonlinear optimization; numerical results
