Acceleration of conjugate gradient algorithms for unconstrained optimization (Q1029371)

From MaRDI portal
Property / DOI: 10.1016/j.amc.2009.03.020
Property / full work available at URL: https://doi.org/10.1016/j.amc.2009.03.020
Property / OpenAlex ID: W1987327258

scientific article
Language: English
Label: Acceleration of conjugate gradient algorithms for unconstrained optimization

    Statements

    Acceleration of conjugate gradient algorithms for unconstrained optimization (English)
    10 July 2009
    A new approach for accelerating conjugate gradient methods is presented. The proposed method modifies the steplength \(\alpha _k\), computed by the Wolfe line search conditions, by means of a positive parameter \(\eta _k\), in order to improve the behavior of the classical conjugate gradient algorithms, which are mainly applied to large-scale unconstrained optimization. It is shown that for uniformly convex functions the convergence of the accelerated algorithm is still linear, but the reduction in function values is significantly improved. Numerical comparisons with conjugate gradient algorithms on a set of 750 unconstrained optimization problems show that the accelerated computational scheme outperforms the corresponding conjugate gradient algorithms.
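    The review describes the general idea, modifying the line-search steplength \(\alpha _k\) by a positive factor \(\eta _k\), without giving the paper's formulas. The sketch below is one plausible realization under stated assumptions: a plain Armijo backtracking stands in for the Wolfe line search, and \(\eta _k\) is taken from a one-dimensional quadratic model of the objective along the search direction (using the gradients at the current and trial points). The function names and the exact form of \(\eta _k\) are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def backtracking(f, grad, x, d, alpha=1.0, c1=1e-4, max_iter=50):
    # Armijo backtracking: a crude stand-in for the Wolfe line search
    # used in the paper (assumption for illustration only).
    fx, slope = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c1 * alpha * slope:
            return alpha
        alpha *= 0.5
    return alpha

def accelerated_cg(f, grad, x0, tol=1e-8, max_iter=500):
    # Conjugate gradient with a steplength-acceleration step.
    # eta below minimizes a quadratic model of f along d, built from
    # the gradients at x and at the trial point z (an assumption about
    # the scheme; the paper defines its own eta_k).
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking(f, grad, x, d)
        z = x + alpha * d              # trial point from the line search
        gz = grad(z)
        a = alpha * (g @ d)            # ~ phi'(0) < 0 for a descent d
        b = alpha * ((gz - g) @ d)     # curvature along d; > 0 if convex
        if b > 0:
            eta = -a / b               # minimizer of the quadratic model
            x_new = x + eta * alpha * d
        else:
            x_new = z                  # fall back to the plain CG step
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PR+ update
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: a small convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 0.5], [0.5, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = accelerated_cg(f, grad, np.zeros(2))
```

    On a quadratic this \(\eta _k\) recovers the exact one-dimensional minimizer along \(d_k\), so the accelerated step corrects an inexact line-search steplength at the cost of one extra gradient evaluation, which matches the behavior the review attributes to the scheme on uniformly convex functions.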
    Wolfe line search
    line search gradient methods
    unconstrained optimization
    convergence acceleration
    numerical examples
    conjugate gradient methods
    large-scale
    convergence
    numerical comparisons
