Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
DOI: 10.1016/j.amc.2011.12.091 · zbMATH Open: 1254.65074 · OpenAlex: W2122859623 · MaRDI QID: Q433285
Authors: Fenghua Wen, Zhifeng Dai
Publication date: 13 July 2012
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2011.12.091
Recommendations
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- Global convergence of a nonlinear conjugate gradient method
- Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence
- A modified Wei-Yao-Liu conjugate gradient method for unconstrained optimization
- Title not available (scientific article; zbMATH DE number 6310924)
Keywords: global convergence; Hestenes-Stiefel method; unconstrained optimization; numerical experiments; conjugate gradient method; Armijo line search; sufficient descent property; Wolfe line search; Polak-Ribière-Polyak method
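Since this record only lists keywords, the following is a minimal, hedged sketch of a nonlinear conjugate gradient iteration using the original Wei-Yao-Liu choice of \(\beta_k\) with a backtracking Armijo line search. It is not the paper's improved variant (which modifies \(\beta_k\) to enforce the sufficient descent property), and the names `wyl_cg`, `f`, and `grad` are illustrative placeholders.

```python
import numpy as np

def wyl_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Sketch of a nonlinear CG method with the original Wei-Yao-Liu beta.

    Illustrative only: the improved beta proposed in the paper is not
    reproduced here; a steepest-descent restart is used as a safeguard.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking Armijo line search (a stand-in for the Wolfe
        # conditions used in the convergence analysis).
        alpha, rho, c1 = 1.0, 0.5, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Original WYL beta:
        # beta_k = (||g_k||^2 - (||g_k||/||g_{k-1}||) g_k^T g_{k-1}) / ||g_{k-1}||^2
        gk, gk1 = np.linalg.norm(g), np.linalg.norm(g_new)
        beta = (gk1**2 - (gk1 / gk) * g_new.dot(g)) / gk**2
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:               # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Example usage: minimize a small convex quadratic.
if __name__ == "__main__":
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
    grad = lambda x: A.dot(x) - b
    print(wyl_cg(f, grad, np.zeros(2)))
```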
Cites Work
- CUTE
- Benchmarking optimization software with performance profiles
- Function minimization by conjugate gradients
- Line search algorithms with guaranteed sufficient decrease
- Title not available
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Efficient generalized conjugate gradient algorithms. I: Theory
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Title not available
- The convergence properties of some new conjugate gradient methods
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search
- A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization
- A survey of nonlinear conjugate gradient methods
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- A note about WYL's conjugate gradient method and its applications
- Title not available
Cited In (39)
- Multiple solutions of second-order damped impulsive differential equations with mixed boundary conditions
- A new method with sufficient descent property for unconstrained optimization
- Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search
- Two modified conjugate gradient methods for unconstrained optimization with applications in image restoration problems
- An efficient conjugate gradient-based algorithm for unconstrained optimization and its projection extension to large-scale constrained nonlinear equations with applications in signal recovery and image denoising problems
- Application of a globally convergent hybrid conjugate gradient method in portfolio optimization
- An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
- Spectral modified Polak-Ribière-Polyak projection conjugate gradient method for solving monotone systems of nonlinear equations
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
- Three modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- Two modified DY conjugate gradient methods for unconstrained optimization problems
- A sufficient descent Dai-Yuan type nonlinear conjugate gradient method for unconstrained optimization problems
- Hybrid random batch idea and nonlinear conjugate gradient method for accelerating charged polymer dynamics simulation
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- Two efficient modifications of AZPRP conjugate gradient method with sufficient descent property
- Delay dynamic double integral inequalities on time scales with applications
- New estimates of \(q_1 q_2\)-Ostrowski-type inequalities within a class of \(n\)-polynomial prevexity of functions
- On new unified bounds for a family of functions via fractional \(q\)-calculus theory
- On new modifications governed by quantum Hahn's integral operator pertaining to fractional calculus
- Some new Hermite-Hadamard-type inequalities associated with conformable fractional integrals and their applications
- A new modified three-term conjugate gradient method with sufficient descent property and its global convergence
- Some new fractional integral inequalities for exponentially \(m\)-convex functions via extended generalized Mittag-Leffler function
- An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search
- Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- Gradient-based parameter identification algorithms for observer canonical state space systems using state estimates
- Two efficient nonlinear conjugate gradient methods for Riemannian manifolds
- Global convergence of a descent PRP type conjugate gradient method for nonconvex optimization
- A class of one parameter conjugate gradient methods
- A family of conjugate gradient methods with guaranteed positiveness and descent for vector optimization
- Some new inequalities involving \(\kappa \)-fractional integral for certain classes of functions and their applications
- Further studies on the Wei-Yao-Liu nonlinear conjugate gradient method
- Inequalities for generalized trigonometric and hyperbolic functions with one parameter
- A hybrid of DL and WYL nonlinear conjugate gradient methods
- A PRP type conjugate gradient method without truncation for nonconvex vector optimization
- Optimal two-parameter geometric and arithmetic mean bounds for the Sándor-Yang mean
- An efficient modified conjugate gradient algorithm under Wolfe conditions with applications in compressive sensing
- A survey of gradient methods for solving nonlinear optimization
- A modified PRP-type conjugate gradient algorithm with complexity analysis and its application to image restoration problems
Uses Software
- CUTE