Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization (Q2390003)

Property / reviewed by: Bülent Karasözen
Property / describes a project that uses: Algorithm 500
Property / describes a project that uses: CUTE
Property / describes a project that uses: L-BFGS
Property / describes a project that uses: CUTEr
Property / describes a project that uses: tn
Property / describes a project that uses: SCALCG
Property / describes a project that uses: CONMIN
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.1016/j.cam.2008.12.024
Property / OpenAlex ID: W2090016606
Property / cites work: Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
Property / cites work: Scaled conjugate gradient algorithms for unconstrained optimization
Property / cites work: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
Property / cites work: A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
Property / cites work: New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
Property / cites work: Preconditioning of Truncated-Newton Methods
Property / cites work: Convergence Conditions for Ascent Methods
Property / cites work: Convergence Conditions for Ascent Methods. II: Some Corrections
Property / cites work: Q5479892
Property / cites work: Benchmarking optimization software with performance profiles
Property / cites work: Q3125512
Property / cites work: Q3539529
Property / cites work: The Conjugate Gradient Method for Linear and Nonlinear Operator Equations
Property / cites work: Convergence Properties of Nonlinear Conjugate Gradient Methods
Property / cites work: An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
Property / cites work: On the limited memory BFGS method for large scale optimization
Property / cites work: On Steepest Descent
Property / cites work: CUTE
Property / cites work: Methods of conjugate gradients for solving linear systems
Property / cites work: Q5563083
Property / cites work: The conjugate gradient method in extremal problems
Property / cites work: An efficient hybrid conjugate gradient method for unconstrained optimization
Property / cites work: New conjugacy conditions and related nonlinear conjugate gradient methods

Language: English
Label: Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
Description: scientific article

    Statements

    Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization (English)
    20 July 2009
    A new algorithm that accelerates the conjugate gradient method is developed by computing the parameter \(\beta_k\) with a finite difference approximation of the Hessian/vector product; the search direction is likewise computed with a forward difference approximation of the Hessian/vector product. In contrast to Newton and quasi-Newton methods, the step lengths of the conjugate gradient method may differ substantially from 1, depending on how the problem is scaled. The authors therefore suggest a modification of the step length that reduces the number of function evaluations compared with currently available conjugate gradient algorithms. The method is proved to be globally convergent, with a linear convergence rate for uniformly convex functions, while the reduction of the function values is significantly improved by the acceleration. The performance of the suggested method is demonstrated on 750 unconstrained large-scale test problems by comparing it with the conjugate gradient codes CONMIN and SCALCG and with truncated Newton methods.
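    The ingredients described above can be illustrated with a short sketch. The Python fragment below is not the authors' code: it combines a forward-difference estimate of the Hessian/vector product, \(\nabla^2 f(x)v \approx (\nabla f(x+\delta v)-\nabla f(x))/\delta\), with a Daniel-type conjugacy formula for \(\beta_k\) and a one-shot interpolation correction of the step length; the Armijo line search, the choice of \(\delta\), and the safeguards are illustrative assumptions rather than the formulas from the paper.

```python
import numpy as np

def hessian_vector_fd(grad, x, v, g_x=None):
    """Forward-difference approximation of the Hessian/vector product:
    nabla^2 f(x) v ~= (nabla f(x + delta*v) - nabla f(x)) / delta."""
    if g_x is None:
        g_x = grad(x)
    nv = np.linalg.norm(v)
    if nv == 0.0:
        return np.zeros_like(v)
    # Heuristic finite-difference step (an assumption, not the paper's choice).
    delta = 2.0 * np.sqrt(np.finfo(float).eps) * (1.0 + np.linalg.norm(x)) / nv
    return (grad(x + delta * v) - g_x) / delta

def accelerated_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Sketch of a conjugate gradient iteration with finite-difference
    Hessian/vector products and a step-length correction."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        # Backtracking (Armijo) line search -- the paper uses a Wolfe-type search.
        fx, dphi0, alpha = f(x), g.dot(d), 1.0
        while f(x + alpha * d) > fx + 1e-4 * alpha * dphi0 and alpha > 1e-16:
            alpha *= 0.5
        # Step-length "acceleration": interpolate phi(a) = f(x + a*d) from
        # phi'(0) and phi'(alpha) and move to the minimizer of the quadratic model.
        gz = grad(x + alpha * d)
        denom = gz.dot(d) - dphi0
        if denom > 0.0:                       # positive curvature along d
            alpha = -dphi0 * alpha / denom
        x_new = x + alpha * d
        g_new = grad(x_new)
        # beta_k from a Daniel-type conjugacy condition, with the Hessian/vector
        # product replaced by its forward-difference estimate.
        hd = hessian_vector_fd(grad, x_new, d, g_new)
        curv = d.dot(hd)
        beta = g_new.dot(hd) / curv if curv > 0.0 else 0.0
        d = -g_new + beta * d
        if d.dot(g_new) >= 0.0:               # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

    On a convex quadratic such as \(f(x)=x^Tx\), for example, accelerated_cg(lambda x: x @ x, lambda x: 2.0 * x, np.ones(5)) returns a point essentially at the origin after a single accelerated step.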
    unconstrained optimization
    conjugate gradient method
    Newton directions
    large scale test examples
    numerical comparisons
    convergence acceleration
    algorithm
    forward difference approximation of Hessian/vector product