Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
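The title refers to the standard finite-difference approximation of a Hessian/vector product, H(x)v ≈ (∇f(x + hv) − ∇f(x))/h, which avoids forming the Hessian explicitly. A minimal illustrative sketch follows; the function name and the quadratic test problem are chosen here for illustration and are not taken from the paper itself:

```python
import numpy as np

def hessian_vector_fd(grad, x, v, h=1e-6):
    """Forward-difference approximation of the Hessian/vector product:
    H(x) @ v  ~=  (grad(x + h*v) - grad(x)) / h.
    Costs only two gradient evaluations and never forms H explicitly."""
    return (grad(x + h * v) - grad(x)) / h

# Illustrative quadratic f(x) = 0.5 * x^T A x, so grad(x) = A x and H(x) = A.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
grad = lambda x: A @ x

x = np.array([1.0, -2.0])
v = np.array([0.5, 1.0])

approx = hessian_vector_fd(grad, x, v)
exact = A @ v
print(np.allclose(approx, exact, atol=1e-4))  # True
```

For a quadratic objective the forward difference is exact up to rounding, since ∇f(x + hv) − ∇f(x) = hAv; for general nonlinear f the error is O(h) plus a rounding term, which motivates careful choice of the step h.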
From MaRDI portal
Publication:2390003
Recommendations
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
Cites work
- scientific article; zbMATH DE number 992790
- scientific article; zbMATH DE number 3278849
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A survey of nonlinear conjugate gradient methods
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- An efficient hybrid conjugate gradient method for unconstrained optimization
- An unconstrained optimization test functions collection
- Benchmarking optimization software with performance profiles
- CUTE
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Methods of conjugate gradients for solving linear systems
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- New conjugacy conditions and related nonlinear conjugate gradient methods
- On Steepest Descent
- On the limited memory BFGS method for large scale optimization
- Preconditioning of Truncated-Newton Methods
- Scaled conjugate gradient algorithms for unconstrained optimization
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- The Conjugate Gradient Method for Linear and Nonlinear Operator Equations
- The conjugate gradient method in extremal problems
Cited in (12)
- Modeling Hessian-vector products in nonlinear optimization: new Hessian-free methods
- A cubic regularization algorithm for unconstrained optimization using line search and nonmonotone techniques
- Modified Hestenes-Stiefel conjugate gradient coefficient for unconstrained optimization
- Accelerated gradient methods combining Tikhonov regularization with geometric damping driven by the Hessian
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
- A new class of nonlinear conjugate gradient coefficients with global convergence properties
- A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- An inexact line search approach using modified nonmonotone strategy for unconstrained optimization
- Convergence acceleration of direct trajectory optimization using novel Hessian calculation methods
This page was built for publication: Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization