A descent family of Dai–Liao conjugate gradient methods
Publication: 5746716
DOI: 10.1080/10556788.2013.833199
zbMATH: 1285.90063
OpenAlex: W2087256975
Wikidata: Q57952757 (Scholia: Q57952757)
MaRDI QID: Q5746716
Saman Babaie-Kafaki, Reza Ghanbari
Publication date: 7 February 2014
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2013.833199
Keywords: unconstrained optimization; global convergence; large-scale optimization; conjugate gradient algorithm; descent condition
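
For context (background only, not taken from this record): the Dai–Liao family determines the search direction of a nonlinear conjugate gradient iteration by

\[
d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{DL}} d_k,
\qquad
\beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k} - t\,\frac{g_{k+1}^{T} s_k}{d_k^{T} y_k},
\quad t > 0,
\]

where g_{k+1} = ∇f(x_{k+1}), s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. The descent family studied in this paper concerns choices of the parameter t for which the directions satisfy a (sufficient) descent condition g_{k+1}^T d_{k+1} < 0; the specific parameter choices are given in the full text linked above.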
Related Items
- Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
- Two optimal Dai–Liao conjugate gradient methods
- Solving nonlinear monotone operator equations via modified SR1 update
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach
- Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme
- An improved Perry conjugate gradient method with adaptive parameter choice
- A descent extension of the Polak-Ribière-Polyak conjugate gradient method
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- A descent Dai-Liao conjugate gradient method for nonlinear equations
- Two families of self-adjusting spectral hybrid DL conjugate gradient methods and applications in image denoising
- A new black box method for monotone nonlinear equations
- An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems
- An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method
- A derivative-free scaling memoryless DFP method for solving large scale nonlinear monotone equations
- A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations
- A family of three-term conjugate gradient projection methods with a restart procedure and their relaxed-inertial extensions for the constrained nonlinear pseudo-monotone equations with applications
- A new sufficiently descent algorithm for pseudomonotone nonlinear operator equations and signal reconstruction
- Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
- Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization
- Modified Dai-Zuan iterative scheme for nonlinear systems and its application
- A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model
- A family of the modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent and conjugacy conditions
- On a scaled symmetric Dai-Liao-type scheme for constrained system of nonlinear equations with applications
- Accelerated Dai-Liao projection method for solving systems of monotone nonlinear equations with application to image deblurring
- A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem
- Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
- A modified conjugate gradient method for general convex functions
- A Modified Nonmonotone Hestenes–Stiefel Type Conjugate Gradient Methods for Large-Scale Unconstrained Problems
- A New Dai-Liao Conjugate Gradient Method with Optimal Parameter Choice
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- On optimality of two adaptive choices for the parameter of Dai-Liao method
- A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations
- Two adaptive Dai-Liao nonlinear conjugate gradient methods
- Scaled three-term derivative-free methods for solving large-scale nonlinear monotone equations
- An efficient modified AZPRP conjugate gradient method for large-scale unconstrained optimization problem
- An efficient adaptive scaling parameter for the spectral conjugate gradient method
- A new efficient conjugate gradient method for unconstrained optimization
- Spectral three-term constrained conjugate gradient algorithm for function minimizations
- Descent Symmetrization of the Dai–Liao Conjugate Gradient Method
- An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation
- MATRIX ANALYSES ON THE DAI–LIAO CONJUGATE GRADIENT METHOD
- Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization
- A class of globally convergent three-term Dai-Liao conjugate gradient methods
- A sufficient descent conjugate gradient method and its global convergence
- Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations
- Two descent Dai-Yuan conjugate gradient methods for systems of monotone nonlinear equations
- A new accelerated conjugate gradient method for large-scale unconstrained optimization
- A Perry-type derivative-free algorithm for solving nonlinear system of equations and minimizing ℓ1 regularized problem
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A NEW THREE–TERM CONJUGATE GRADIENT METHOD WITH DESCENT DIRECTION FOR UNCONSTRAINED OPTIMIZATION
- A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice
- A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
- Modified optimal Perry conjugate gradient method for solving system of monotone equations with applications
- Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations
Cites Work
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Two new conjugate gradient methods based on modified secant equations
- New quasi-Newton equation and related methods for unconstrained optimization
- Multi-step quasi-Newton methods for optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Using function-values in multi-step quasi-Newton methods
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- New quasi-Newton methods for unconstrained optimization problems
- Optimization theory and methods. Nonlinear programming
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- A Modified BFGS Algorithm for Unconstrained Optimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Algorithm 851
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles