A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem
From MaRDI portal
Publication: Q2009756
DOI: 10.1007/s10092-019-0333-4
zbMath: 1433.90160
OpenAlex: W2974670071
Wikidata: Q127228703 (Scholia: Q127228703)
MaRDI QID: Q2009756
Keyvan Amini, Parvaneh Faramarzi
Publication date: 29 November 2019
Published in: Calcolo
Full work available at URL: https://doi.org/10.1007/s10092-019-0333-4
Keywords: three-term conjugate gradient method; scaling parameter; sufficient descent property; Armijo-type line search; moving asymptotes
Related Items
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- A family of hybrid conjugate gradient method with restart procedure for unconstrained optimizations and image restorations
- Global convergence of a modified spectral three-term CG algorithm for nonconvex unconstrained optimization problems
Uses Software
Cites Work
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- A globally convergent version of the Polak-Ribière conjugate gradient method
- A modified spectral conjugate gradient method with global convergence
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- A new method of moving asymptotes for large-scale unconstrained optimization
- A scaled conjugate gradient method with moving asymptotes for unconstrained optimization problems
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- The method of moving asymptotes—a new method for structural optimization
- On the Convergence of a New Conjugate Gradient Algorithm
- Numerical Optimization
- CUTE
- A globally convergent method of moving asymptotes with trust region technique
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- A modified Hestenes–Stiefel conjugate gradient method with an optimal property
- Some descent three-term conjugate gradient methods and their global convergence
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Global convergence of a nonlinear programming method using convex approximations
- Benchmarking optimization software with performance profiles.