A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family
From MaRDI portal
Publication: 3458820
DOI: 10.1080/10556788.2014.1001511
zbMath: 1328.90143
MaRDI QID: Q3458820
No author found.
Publication date: 28 December 2015
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2014.1001511
Keywords: unconstrained optimization; conjugate gradient methods; modified secant equation; global convergence property
MSC classes:
- 65K05: Numerical mathematical programming methods
- 90C30: Nonlinear programming
- 49M37: Numerical methods based on nonlinear programming
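For orientation, the Dai–Liao family indexed here builds search directions from the update d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k). The sketch below is an illustrative implementation of that generic family update with a simple Armijo backtracking line search and a restart safeguard; it is an assumption-laden stand-in, not the specific parameter choice or line search proposed in the indexed paper, and the function name `dai_liao_cg` is invented for this example.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=500):
    """Illustrative generic Dai-Liao conjugate gradient iteration.

    This is a sketch of the family update, not the variant of the
    indexed paper (which uses a particular parameter choice and a
    Wolfe-type line search).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (simple stand-in for the
        # Wolfe conditions commonly assumed in convergence proofs).
        alpha = 1.0
        for _ in range(60):
            if f(x + alpha * d) <= f(x) + 1e-4 * alpha * g.dot(d):
                break
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d.dot(y)
        # Dai-Liao parameter: beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k)
        beta = g_new.dot(y - t * s) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:
            d = -g_new  # restart if the new direction is not descent
        x, g = x_new, g_new
    return x
```

On a convex quadratic f(x) = 0.5 x^T A x - b^T x, the iteration drives the gradient norm below `tol` and recovers the minimizer A^{-1} b; the restart safeguard mirrors the classical restart procedures cited below.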
Related Items
- A New Dai-Liao Conjugate Gradient Method with Optimal Parameter Choice
- Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization
- An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems
- A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems
- An improved Perry conjugate gradient method with adaptive parameter choice
- A survey of gradient methods for solving nonlinear optimization
- A novel value for the parameter in the Dai-Liao-type conjugate gradient method
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
Cites Work
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Two new conjugate gradient methods based on modified secant equations
- New quasi-Newton equation and related methods for unconstrained optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Descent Property and Global Convergence of the Fletcher–Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- Numerical Optimization
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- Function minimization by conjugate gradients
- CUTEr and SifDec
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles
- An efficient hybrid conjugate gradient method for unconstrained optimization