Another hybrid conjugate gradient algorithm for unconstrained optimization
DOI: 10.1007/s11075-007-9152-9
zbMath: 1141.65041
OpenAlex: W2032668178
MaRDI QID: Q2481406
Publication date: 9 April 2008
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-007-9152-9
Keywords: unconstrained optimization; convergence; numerical comparisons; hybrid conjugate gradient method; Newton direction
MSC classification: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30)
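The keywords above name a hybrid conjugate gradient method and the Newton direction. As a purely illustrative aid, and not the algorithm of the publication indexed on this page, the sketch below shows one common way such hybrids are built: the update parameter is taken as a convex combination of the Hestenes-Stiefel and Dai-Yuan formulas. The function name `hybrid_cg`, the fixed mixing weight `theta`, the backtracking line search, and the Rosenbrock test are all assumptions introduced here for illustration.

```python
# Minimal sketch of a generic hybrid conjugate gradient iteration.
# NOT the method of this publication; illustrative only.
import numpy as np


def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=2000):
    """Minimize f from x0 using a hybrid CG direction update.

    theta in [0, 1] blends beta_HS (theta = 0) and beta_DY (theta = 1).
    A simple Armijo backtracking line search stands in for the Wolfe
    line search usually paired with CG methods.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                       # safeguard: restart if not a descent direction
            d = -g
        # Armijo backtracking line search (placeholder for Wolfe conditions).
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d) and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                        # gradient difference
        denom = d @ y
        if abs(denom) < 1e-12:               # safeguard: fall back to steepest descent
            beta = 0.0
        else:
            beta_hs = (g_new @ y) / denom        # Hestenes-Stiefel parameter
            beta_dy = (g_new @ g_new) / denom    # Dai-Yuan parameter
            beta = (1.0 - theta) * beta_hs + theta * beta_dy
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Quick check on the Rosenbrock function.
    rosen = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
    rosen_grad = lambda z: np.array([
        -2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0] ** 2),
        200 * (z[1] - z[0] ** 2),
    ])
    print(hybrid_cg(rosen, rosen_grad, np.array([-1.2, 1.0])))
```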
Related Items (41)
- The Hager–Zhang conjugate gradient algorithm for large-scale nonlinear equations
- New hybrid conjugate gradient method for unconstrained optimization
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing
- An efficient hybrid conjugate gradient method with sufficient descent property for unconstrained optimization
- A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach
- New hybrid conjugate gradient method as a convex combination of LS and FR methods
- Two derivative-free projection approaches for systems of large-scale nonlinear monotone equations
- Further comment on another hybrid conjugate gradient algorithm for unconstrained optimization by Andrei
- An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method
- Two families of hybrid conjugate gradient methods with restart procedures and their applications
- Another hybrid approach for solving monotone operator equations and application to signal processing
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- A new hybrid conjugate gradient algorithm based on the Newton direction to solve unconstrained optimization problems
- A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem
- An efficient new hybrid CG-method as convex combination of DY and CD and HS algorithms
- A hybrid Riemannian conjugate gradient method for nonconvex optimization problems
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- An improved three-term derivative-free method for solving nonlinear equations
- Two modified three-term conjugate gradient methods with sufficient descent property
- FR type methods for systems of large-scale nonlinear monotone equations
- An efficient hybrid conjugate gradient method for unconstrained optimization
- A conjugate gradient method for unconstrained optimization problems
- Applying Powell's symmetrical technique to conjugate gradient methods
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- Improved conjugate gradient method for nonlinear system of equations
- New hybrid conjugate gradient method as a convex combination of LS and CD methods
- Erratum to: "Comments on 'Another hybrid conjugate gradient algorithm for unconstrained optimization' by Andrei"
- Behavior of the combination of PRP and HZ methods for unconstrained optimization
- A modified spectral conjugate gradient method with global convergence
- Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
- A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function
- Comments on "A hybrid conjugate gradient method based on a quadratic relaxation of the Dai-Yuan hybrid conjugate gradient parameter"
- Comments on "New hybrid conjugate gradient method as a convex combination of FR and PRP methods"
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter
- Comments on "Another hybrid conjugate gradient algorithm for unconstrained optimization" by Andrei
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
Uses Software
Cites Work
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Global convergence result for conjugate gradient methods
- Scaled conjugate gradient algorithms for unconstrained optimization
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A spectral conjugate gradient method for unconstrained optimization
- New properties of a nonlinear conjugate gradient method
- Benchmarking optimization software with performance profiles
- An efficient hybrid conjugate gradient method for unconstrained optimization