A limited memory descent Perry conjugate gradient method
DOI: 10.1007/s11590-015-0979-z
zbMATH: 1365.90242
OpenAlex: W2276689698
MaRDI QID: Q518141
Ioannis E. Livieris, Panagiotis Pintelas
Publication date: 28 March 2017
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-015-0979-z
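For orientation only (this record carries no abstract), the following is a minimal Python sketch of a generic Perry-type nonlinear conjugate gradient iteration, using the classical Perry parameter beta_k = g_{k+1}^T (y_k - s_k) / (d_k^T y_k) from Perry's modified conjugate gradient algorithm cited below. It is not the paper's limited memory descent variant; the function names, the Armijo backtracking line search, and the test quadratic are illustrative assumptions.

```python
import numpy as np

def perry_cg(f, grad, x0, max_iter=200, tol=1e-6):
    """Generic Perry-type nonlinear CG sketch (illustrative, not the paper's method)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Simple Armijo backtracking line search (placeholder for a Wolfe-type search).
        alpha, c, rho = 1.0, 1e-4, 0.5
        for _ in range(50):
            if f(x + alpha * d) <= f(x) + c * alpha * g.dot(d):
                break
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s = x_new - x            # s_k = x_{k+1} - x_k
        y = g_new - g            # y_k = g_{k+1} - g_k
        denom = d.dot(y)
        # Perry's parameter: beta_k = g_{k+1}^T (y_k - s_k) / (d_k^T y_k);
        # fall back to steepest descent (beta = 0) if the denominator is tiny.
        beta = g_new.dot(y - s) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example use on a simple ill-conditioned quadratic (hypothetical test problem).
if __name__ == "__main__":
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda x: 0.5 * x.dot(A).dot(x)
    grad = lambda x: A.dot(x)
    print(perry_cg(f, grad, np.ones(3)))
```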
Related Items (2)
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- An improved Perry conjugate gradient method with adaptive parameter choice
Uses Software
Cites Work
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Globally convergent modified Perry's conjugate gradient method
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- A modified CG-DESCENT method for unconstrained optimization
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- Sufficient descent nonlinear conjugate gradient methods with conjugacy condition
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Two new conjugate gradient methods based on modified secant equations
- Two modified Dai-Yuan nonlinear conjugate gradient methods
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- A new class of spectral conjugate gradient methods based on a modified secant equation for unconstrained optimization
- A modified Perry conjugate gradient method and its global convergence
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Two descent hybrid conjugate gradient methods for optimization
- Reduced-Hessian Quasi-Newton Methods for Unconstrained Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Two-Point Step Size Gradient Methods
- Technical Note—A Modified Conjugate Gradient Algorithm
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- Numerical Optimization
- CUTE
- Limited-Memory Reduced-Hessian Methods for Large-Scale Unconstrained Optimization
- Two modified hybrid conjugate gradient methods based on a hybrid secant equation
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The Limited Memory Conjugate Gradient Method
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles.