Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
From MaRDI portal
Publication: 433285
DOI: 10.1016/j.amc.2011.12.091 · zbMath: 1254.65074 · OpenAlex: W2122859623 · MaRDI QID: Q433285
Publication date: 13 July 2012
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2011.12.091
Keywords: unconstrained optimization · global convergence · numerical experiments · conjugate gradient method · Wolfe line search · Armijo line search · sufficient descent property · Polak-Ribière-Polyak method · Hestenes-Stiefel method
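The keywords place this work in the family of Wei-Yao-Liu (WYL) nonlinear conjugate gradient methods. As context, the sketch below shows a generic nonlinear CG loop using the classical WYL beta with Armijo backtracking; it is a minimal illustration, not the improved formula proposed in the paper. The Rosenbrock test function, the line-search parameters, and the restart safeguard are illustrative assumptions.

```python
# Minimal sketch: nonlinear conjugate gradient iteration with the classical
# Wei-Yao-Liu beta and Armijo backtracking. Illustrative only; the paper
# studies an improved WYL variant whose formula is not reproduced here.
import numpy as np

def rosenbrock(x):
    # Standard 2-D Rosenbrock test function (assumed example objective).
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def armijo(f, x, d, g, alpha=1.0, rho=0.5, c1=1e-4):
    # Backtracking Armijo line search: shrink alpha until sufficient decrease holds.
    while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
        alpha *= rho
    return alpha

def wyl_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = x0.copy()
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = armijo(f, x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Classical WYL beta (Wei, Yao, Liu 2006):
        # beta_k = g_k^T (g_k - (||g_k|| / ||g_{k-1}||) g_{k-1}) / ||g_{k-1}||^2
        beta = g_new.dot(g_new - (np.linalg.norm(g_new) / np.linalg.norm(g)) * g) / g.dot(g)
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:           # restart if the descent property is lost (safeguard)
            d = -g_new
        x, g = x_new, g_new
    return x

x_star = wyl_cg(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
```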
Related Items (33)
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
- Two efficient modifications of AZPRP conjugate gradient method with sufficient descent property
- Delay dynamic double integral inequalities on time scales with applications
- A sufficient descent Dai-Yuan type nonlinear conjugate gradient method for unconstrained optimization problems
- A class of one parameter conjugate gradient methods
- An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search
- Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search
- Some new inequalities involving \(\kappa\)-fractional integral for certain classes of functions and their applications
- Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems
- A new modified three-term conjugate gradient method with sufficient descent property and its global convergence
- A modified PRP-type conjugate gradient algorithm with complexity analysis and its application to image restoration problems
- Inequalities for generalized trigonometric and hyperbolic functions with one parameter
- Hybrid random batch idea and nonlinear conjugate gradient method for accelerating charged polymer dynamics simulation
- New estimates of \(q_1 q_2\)-Ostrowski-type inequalities within a class of \(n\)-polynomial prevexity of functions
- On new unified bounds for a family of functions via fractional \(q\)-calculus theory
- On new modifications governed by quantum Hahn's integral operator pertaining to fractional calculus
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- A survey of gradient methods for solving nonlinear optimization
- A hybrid of DL and WYL nonlinear conjugate gradient methods
- Multiple solutions of second-order damped impulsive differential equations with mixed boundary conditions
- A new method with sufficient descent property for unconstrained optimization
- An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- Spectral modified Polak-Ribière-Polyak projection conjugate gradient method for solving monotone systems of nonlinear equations
- Gradient-based parameter identification algorithms for observer canonical state space systems using state estimates
- Two modified DY conjugate gradient methods for unconstrained optimization problems
- Some new Hermite-Hadamard-type inequalities associated with conformable fractional integrals and their applications
- Optimal two-parameter geometric and arithmetic mean bounds for the Sándor-Yang mean
- Some new fractional integral inequalities for exponentially \(m\)-convex functions via extended generalized Mittag-Leffler function
- Global convergence of a descent PRP type conjugate gradient method for nonconvex optimization
- Three modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- Two modified conjugate gradient methods for unconstrained optimization with applications in image restoration problems
- An efficient conjugate gradient-based algorithm for unconstrained optimization and its projection extension to large-scale constrained nonlinear equations with applications in signal recovery and image denoising problems
Uses Software
Cites Work
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- The convergence properties of some new conjugate gradient methods
- Efficient generalized conjugate gradient algorithms. I: Theory
- A note about WYL's conjugate gradient method and its applications
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search
- A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- CUTE: constrained and unconstrained testing environment
- Line search algorithms with guaranteed sufficient decrease
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles.