Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
From MaRDI portal
Publication:2390003
DOI: 10.1016/j.cam.2008.12.024
zbMath: 1170.65046
MaRDI QID: Q2390003
Publication date: 20 July 2009
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2008.12.024
Keywords: algorithm; unconstrained optimization; convergence acceleration; conjugate gradient method; numerical comparisons; Newton directions; forward difference approximation of Hessian/vector product; large scale test examples
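The keywords name a forward difference approximation of the Hessian/vector product, a standard device for obtaining Newton-like directions without forming the Hessian: since the directional derivative of the gradient satisfies ∇²f(x)v ≈ (∇f(x + εv) − ∇f(x))/ε for small ε, one gradient evaluation per product suffices. A minimal sketch of this idea (the function names and step size ε are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def hessian_vector_fd(grad, x, v, eps=1e-6):
    # Forward-difference Hessian/vector product:
    #   H(x) v ≈ (grad(x + eps*v) - grad(x)) / eps
    # One extra gradient evaluation; no Hessian is formed or stored.
    return (grad(x + eps * v) - grad(x)) / eps

# Illustration on a quadratic f(x) = 0.5 x^T A x, where grad(x) = A x
# and the true Hessian is A, so the approximation is exact up to rounding.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
grad = lambda x: A @ x

x = np.array([1.0, 2.0])
v = np.array([0.5, -1.0])
hv = hessian_vector_fd(grad, x, v)
```

For nonquadratic functions the approximation carries an O(ε) truncation error, which is why such schemes balance ε against floating-point cancellation when the product is used inside a conjugate gradient iteration.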
65K05: Numerical mathematical programming methods
90C06: Large-scale problems in mathematical programming
90C30: Nonlinear programming
Related Items
- Modified Hestenes-Stiefel conjugate gradient coefficient for unconstrained optimization
- A new class of nonlinear conjugate gradient coefficients with global convergence properties
- An inexact line search approach using modified nonmonotone strategy for unconstrained optimization
Uses Software
Cites Work
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- On the limited memory BFGS method for large scale optimization
- Scaled conjugate gradient algorithms for unconstrained optimization
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- Preconditioning of Truncated-Newton Methods
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- CUTE
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- The Conjugate Gradient Method for Linear and Nonlinear Operator Equations
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- On Steepest Descent
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles.
- An efficient hybrid conjugate gradient method for unconstrained optimization