An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
Publication: 2336064
DOI: 10.1007/s10288-018-0387-1
zbMATH: 1425.90134
OpenAlex: W2893199587
Wikidata: Q129189487 (Scholia: Q129189487)
MaRDI QID: Q2336064
Saman Babaie-Kafaki, Zohre Aminifard
Publication date: 18 November 2019
Published in: 4OR
Full work available at URL: https://doi.org/10.1007/s10288-018-0387-1
Keywords: unconstrained optimization; global convergence; conjugate gradient method; nonlinear programming; maximum magnification
MSC classification: Numerical mathematical programming methods (65K05); Methods of quasi-Newton type (90C53); Numerical computation of matrix norms, conditioning, scaling (65F35)
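For context, the Dai-Liao family named in the title is usually stated as below; this is the standard formulation from the cited literature, not text taken from this record, with the customary notation g_k = \nabla f(x_k), s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k and a nonnegative parameter t.

\[
d_0 = -g_0, \qquad
d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{DL}} d_k, \qquad
\beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{\top} y_k - t\, g_{k+1}^{\top} s_k}{d_k^{\top} y_k}, \quad t \ge 0.
\]

Under this convention the step can be written as d_{k+1} = -Q_{k+1} g_{k+1} with Q_{k+1} = I - \frac{d_k y_k^{\top}}{d_k^{\top} y_k} + t\,\frac{d_k s_k^{\top}}{d_k^{\top} y_k}, and the title's "maximum magnification" refers to the direction x that maximizes \|Q_{k+1} x\| / \|x\|; the exact search direction matrix analyzed in the paper may differ in scaling from this sketch.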
Related Items (14)
Uses Software
Cites Work
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Two modified three-term conjugate gradient methods with sufficient descent property
- On the sufficient descent condition of the Hager-Zhang conjugate gradient methods
- An adaptive conjugate gradient algorithm for large-scale unconstrained optimization
- A class of adaptive Dai-Liao conjugate gradient methods based on the scaled memoryless BFGS update
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Two new conjugate gradient methods based on modified secant equations
- Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme
- A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues
- Two adaptive Dai-Liao nonlinear conjugate gradient methods
- Optimization theory and methods. Nonlinear programming
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- Convergence Properties of Algorithms for Nonlinear Optimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Two optimal Dai–Liao conjugate gradient methods
- A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family
- Algorithm 851
- Technical Note—A Modified Conjugate Gradient Algorithm
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- CUTEr and SifDec
- A descent family of Dai–Liao conjugate gradient methods
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles.
- A new efficient conjugate gradient method for unconstrained optimization