An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
Publication:1626534
DOI: 10.1007/s10957-018-1377-3 · zbMath: 1402.90176 · OpenAlex: W2889379562 · MaRDI QID: Q1626534
Jianguang Zhu, Lixiang Li, Deren Han, Xiao Liang Dong, Zhi-feng Dai
Publication date: 27 November 2018
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-018-1377-3
Keywords: global convergence · condition number · conjugacy condition · three-term conjugate gradient method · sufficient descent condition
Related Items (13)
- Further comment on another hybrid conjugate gradient algorithm for unconstrained optimization by Andrei
- A nonlinear conjugate gradient method using inexact first-order information
- A truncated three-term conjugate gradient method with complexity guarantees with applications to nonconvex regression problem
- Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization
- An adaptive modified three-term conjugate gradient method with global convergence
- Adaptive type-2 neural fuzzy sliding mode control of a class of nonlinear systems
- A modified nonlinear Polak-Ribière-Polyak conjugate gradient method with sufficient descent property
- An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
- Two modified DY conjugate gradient methods for unconstrained optimization problems
- A novel method of dynamic force identification and its application
- Some three-term conjugate gradient methods with the new direction structure
- A modified spectral conjugate gradient method with global convergence
Uses Software
Cites Work
- On three-term conjugate gradient algorithms for unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Two modified three-term conjugate gradient methods with sufficient descent property
- A three-term conjugate gradient algorithm for large-scale unconstrained optimization problems
- Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence
- Spectral gradient method for impulse noise removal
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- An accurate active set Newton algorithm for large scale bound constrained optimization.
- An active set strategy based on the multiplier function or the gradient.
- Parallel SSLE algorithm for large scale constrained optimization
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations
- Exploiting damped techniques for nonlinear conjugate gradient methods
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- A decomposition method for large-scale box constrained optimization
- A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- An active set modified Polak-Ribière-Polyak method for large-scale nonlinear bound constrained optimization
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- The global convergence of a modified BFGS method for nonconvex functions
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- A modified two-point stepsize gradient algorithm for unconstrained minimization
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- Technical Note—A Modified Conjugate Gradient Algorithm
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Two modified spectral conjugate gradient methods and their global convergence for unconstrained optimization
- Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The Limited Memory Conjugate Gradient Method
- Some descent three-term conjugate gradient methods and their global convergence
- Convergence Conditions for Ascent Methods
- A descent family of Dai–Liao conjugate gradient methods
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles.
- A new efficient conjugate gradient method for unconstrained optimization