Sufficient descent nonlinear conjugate gradient methods with conjugacy condition
Publication: 849150
DOI: 10.1007/s11075-009-9318-8
zbMath: 1185.65097
OpenAlex: W2081790497
MaRDI QID: Q849150
Publication date: 24 February 2010
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-009-9318-8
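For orientation, the two properties named in the title have standard forms in the nonlinear conjugate gradient literature. The LaTeX sketch below records them in the usual notation; this is background context only, not a statement of this paper's specific method or parameter choices.

% Standard statements of the two conditions named in the title.
% Usual notation: g_k = \nabla f(x_k) is the gradient, d_k the search
% direction, s_{k-1} = x_k - x_{k-1}, y_{k-1} = g_k - g_{k-1}.
% The paper's exact constants and scaling may differ.
\[
  \text{sufficient descent:}\qquad
  g_k^{\top} d_k \le -c\,\lVert g_k \rVert^{2}
  \quad \text{for some constant } c > 0,
\]
\[
  \text{Dai--Liao conjugacy condition:}\qquad
  d_k^{\top} y_{k-1} = -t\, g_k^{\top} s_{k-1},
  \qquad t \ge 0.
\]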
Related Items
- A memory gradient method based on the nonmonotone technique
- Two derivative-free projection approaches for systems of large-scale nonlinear monotone equations
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- A spectral conjugate gradient method for solving large-scale unconstrained optimization
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Matrix Structures in Queuing Models
- Globally convergent modified Perry's conjugate gradient method
- Sufficient descent conjugate gradient methods for solving convex constrained nonlinear monotone equations
- FR type methods for systems of large-scale nonlinear monotone equations
- A new class of nonmonotone conjugate gradient training algorithms
- An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
- A limited memory descent Perry conjugate gradient method
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- A nonmonotone supermemory gradient algorithm for unconstrained optimization
- A conjugate gradient method with sufficient descent property
- A modified Perry conjugate gradient method and its global convergence
Uses Software
Cites Work
- Unnamed Item
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- On the limited memory BFGS method for large scale optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A spectral conjugate gradient method for unconstrained optimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles