Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization
DOI: 10.1007/s11075-019-00709-7 · zbMATH Open: 1436.90109 · OpenAlex: W2942560097 · Wikidata: Q127929528 · Scholia: Q127929528 · MaRDI QID: Q2299208 · FDO: Q2299208
Publication date: 20 February 2020
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-019-00709-7
Keywords: nonconvex optimization; global convergence; conjugate gradient method; weak Wolfe-Powell line search technique; large-scale optimization problem
MSC: Large-scale problems in mathematical programming (90C06); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30)
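The title and keywords refer to three-term conjugate gradient directions that guarantee sufficient descent under a weak Wolfe-Powell line search. As a generic illustration only (a standard Hestenes-Stiefel-type three-term variant, not the two scaled families proposed in this paper), the direction d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k with beta_k = g_{k+1}^T y_k / (d_k^T y_k) and theta_k = g_{k+1}^T d_k / (d_k^T y_k) satisfies g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 by construction. A minimal sketch, with hypothetical function names:

```python
import numpy as np

def weak_wolfe(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Bisection/expansion search for a step satisfying the weak
    Wolfe-Powell conditions (Armijo + curvature)."""
    lo, hi = 0.0, np.inf
    fx, gd = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gd:   # Armijo fails: shrink
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gd:       # curvature fails: grow
            lo = alpha
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic three-term CG iteration; g_new @ d_new = -||g_new||^2
    holds exactly, so every direction is a sufficient descent direction."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = weak_wolfe(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        if abs(denom) < 1e-12:           # safeguard: restart with steepest descent
            d_new = -g_new
        else:
            beta = (g_new @ y) / denom
            theta = (g_new @ d) / denom
            d_new = -g_new + beta * d - theta * y
        x, g, d = x_new, g_new, d_new
    return x, g
```

The third term (-theta_k y_k) cancels the beta_k d_k contribution in g_{k+1}^T d_{k+1}, which is what enforces the descent identity regardless of the line search accuracy; the scaled families of the paper modify this construction further.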
Cites Work
- Benchmarking optimization software with performance profiles
- Function minimization by conjugate gradients
- Optimization theory and methods. Nonlinear programming
- On three-term conjugate gradient algorithms for unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- A conjugate direction algorithm without line searches
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A modified Polak–Ribière–Polyak conjugate gradient algorithm for unconstrained optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Restart procedures for the conjugate gradient method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Some descent three-term conjugate gradient methods and their global convergence
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A spectral conjugate gradient method for unconstrained optimization
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- A three-term conjugate gradient algorithm for large-scale unconstrained optimization problems
- A descent family of Dai–Liao conjugate gradient methods
- On optimality of two adaptive choices for the parameter of Dai-Liao method
- A modified three-term PRP conjugate gradient algorithm for optimization models
- A new three-term conjugate gradient method
- New three-term conjugate gradient method with guaranteed global convergence
- Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- An efficient modified Polak–Ribière–Polyak conjugate gradient method with global convergence properties
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- A subspace conjugate gradient algorithm for large-scale unconstrained optimization
- An inexact Newton method for solving complementarity problems in hydrodynamic lubrication
- Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
- A scaled three-term conjugate gradient method for unconstrained optimization
- A scaled conjugate gradient method for nonlinear unconstrained optimization
Cited In (14)
- A Five-Parameter Class of Derivative-Free Spectral Conjugate Gradient Methods for Systems of Large-Scale Nonlinear Monotone Equations
- Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing
- Some improved Dai-Yuan conjugate gradient methods for large-scale unconstrained optimization problems
- An efficient projection algorithm for large-scale system of monotone nonlinear equations with applications in signal recovery
- A family of the modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent and conjugacy conditions
- A new smoothing spectral conjugate gradient method for solving tensor complementarity problems
- An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method
- A three-term projection method based on spectral secant equation for nonlinear monotone equations
- Scaled three-term derivative-free methods for solving large-scale nonlinear monotone equations
- A modified scaled spectral-conjugate gradient-based algorithm for solving monotone operator equations
- A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem
- Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization
- Modified spectral PRP conjugate gradient method for solving tensor eigenvalue complementarity problems
- A spectral gradient-type derivative-free projection algorithm for solving nonlinear convex constrained equations with its application