Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization
DOI: 10.1007/s11075-019-00709-7
zbMath: 1436.90109
OpenAlex: W2942560097
Wikidata: Q127929528
Scholia: Q127929528
MaRDI QID: Q2299208
Publication date: 20 February 2020
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-019-00709-7
Keywords: global convergence; conjugate gradient method; nonconvex optimization; weak Wolfe-Powell line search technique; large-scale optimization problem
MSC classification: Large-scale problems in mathematical programming (90C06); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30)
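As context for the record above: three-term conjugate gradient methods of the kind this paper studies generate search directions of the form d_{k+1} = -g_{k+1} + β_k d_k + θ_k y_k, chosen so that the sufficient descent condition g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 holds, with step sizes satisfying the weak Wolfe-Powell conditions. The sketch below is a generic illustration only, using a well-known Hestenes-Stiefel-type three-term update and a simple bisection line search; it is not the paper's specific scaled family, and all parameter formulas here are assumptions for demonstration.

```python
# Generic three-term CG sketch (illustrative; NOT the paper's scaled family).
import numpy as np

def weak_wolfe(f, g, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_bisect=50):
    """Bisection-style search for a step satisfying the weak Wolfe-Powell
    conditions: Armijo decrease plus the (weak) curvature condition."""
    lo, hi = 0.0, np.inf
    fx, gxd = f(x), g(x) @ d          # gxd < 0 for a descent direction
    for _ in range(max_bisect):
        if f(x + alpha * d) > fx + c1 * alpha * gxd:   # Armijo fails
            hi = alpha
        elif g(x + alpha * d) @ d < c2 * gxd:          # curvature fails
            lo = alpha
        else:
            return alpha
        alpha = 2 * lo if np.isinf(hi) else 0.5 * (lo + hi)
    return alpha

def three_term_cg(f, g, x0, tol=1e-6, max_iter=500):
    """Three-term CG with an HS-type beta and a third term along y_k.
    This choice makes g^T d = -||g||^2 hold exactly at every iteration."""
    x = x0.astype(float)
    gk = g(x)
    d = -gk
    for _ in range(max_iter):
        if np.linalg.norm(gk) < tol:
            break
        alpha = weak_wolfe(f, g, x, d)
        x_new = x + alpha * d
        g_new = g(x_new)
        y = g_new - gk
        denom = d @ y                       # > 0 under Wolfe conditions
        beta = (g_new @ y) / denom          # Hestenes-Stiefel parameter
        theta = (g_new @ d) / denom
        d = -g_new + beta * d - theta * y   # three-term direction
        x, gk = x_new, g_new
    return x

# Usage on a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
g = lambda x: A @ x - b
x_star = three_term_cg(f, g, np.zeros(2))
```

With this particular θ_k, the inner product g_{k+1}^T d_{k+1} collapses algebraically to -||g_{k+1}||^2, which is the sufficient descent property the record's title refers to; the scaled families in the paper achieve it by different parameter choices.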
Related Items (10)
Uses Software
Cites Work
- On three-term conjugate gradient algorithms for unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- An optimal parameter for Dai–Liao family of conjugate gradient methods
- A three-term conjugate gradient algorithm for large-scale unconstrained optimization problems
- On optimality of two adaptive choices for the parameter of Dai–Liao method
- A modified three-term PRP conjugate gradient algorithm for optimization models
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- A new three-term conjugate gradient method
- A conjugate direction algorithm without line searches
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- A subspace conjugate gradient algorithm for large-scale unconstrained optimization
- Two new Dai–Liao-type conjugate gradient methods for unconstrained optimization problems
- An inexact Newton method for solving complementarity problems in hydrodynamic lubrication
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- A scaled three-term conjugate gradient method for unconstrained optimization
- Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- New three-term conjugate gradient method with guaranteed global convergence
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A modified Polak–Ribière–Polyak conjugate gradient algorithm for unconstrained optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- A scaled conjugate gradient method for nonlinear unconstrained optimization
- An efficient modified Polak–Ribière–Polyak conjugate gradient method with global convergence properties
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Some descent three-term conjugate gradient methods and their global convergence
- The conjugate gradient method in extremal problems
- A descent family of Dai–Liao conjugate gradient methods
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A spectral conjugate gradient method for unconstrained optimization
- Benchmarking optimization software with performance profiles