A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions
DOI: 10.1007/s12190-022-01724-z · zbMATH Open: 1502.65035 · OpenAlex: W4293100925 · MaRDI QID: Q2103175 · FDO: Q2103175
Mengxiang Zhang, Jiajia Yu, Gonglin Yuan, Ailun Jian
Publication date: 13 December 2022
Published in: Journal of Applied Mathematics and Computing
Full work available at URL: https://doi.org/10.1007/s12190-022-01724-z
Recommendations
- Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- A conjugate gradient method with sufficient descent and global convergence for unconstrained nonlinear optimization
- A nonmonotone conjugate gradient algorithm for unconstrained optimization
- On the global convergence of the Hager-Zhang conjugate gradient method with Armijo line search
Mathematics Subject Classification
- Numerical mathematical programming methods (65K05)
- Nonconvex programming, global optimization (90C26)
- Nonlinear programming (90C30)
- Methods of reduced gradient type (90C52)
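The record itself carries only bibliographic data and does not describe the paper's modified scheme or its weakened assumptions. For orientation, the sketch below shows the classical Hager-Zhang (HZ) direction update from the cited work "A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search", paired with a plain Armijo backtracking step; the function name `hz_conjugate_gradient`, the Armijo placeholder, and the Rosenbrock test problem are illustrative assumptions, not the authors' method.

```python
import numpy as np


def hz_conjugate_gradient(f, grad, x0, max_iter=1000, tol=1e-6):
    """Sketch of the classical Hager-Zhang (HZ) conjugate gradient iteration.

    Uses a simple backtracking Armijo line search as a placeholder for the
    Wolfe-type search of Hager and Zhang; the modified HZ scheme of the
    indexed paper is NOT reproduced here.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g @ d >= 0:                    # safeguard: restart if not a descent direction
            d = -g
        # Backtracking Armijo line search.
        alpha, c1, rho = 1.0, 1e-4, 0.5
        fx, gTd = f(x), g @ d
        for _ in range(50):
            if f(x + alpha * d) <= fx + c1 * alpha * gTd:
                break
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dTy = d @ y
        if abs(dTy) > 1e-12:
            # Classical HZ parameter:
            # beta = (y - 2*d*||y||^2 / (d^T y))^T g_new / (d^T y)
            beta = (y - 2.0 * d * (y @ y) / dTy) @ g_new / dTy
        else:
            beta = 0.0                    # degenerate curvature: restart
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Illustrative nonconvex test problem (Rosenbrock), assumed here for demonstration.
    rosen = lambda x: (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2
    rosen_grad = lambda x: np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
        200.0 * (x[1] - x[0] ** 2),
    ])
    print(hz_conjugate_gradient(rosen, rosen_grad, [-1.2, 1.0]))
```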
Cites Work
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Title not available
- Function minimization by conjugate gradients
- Scaled conjugate gradient algorithms for unconstrained optimization
- Title not available
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Title not available
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- A method for the solution of certain non-linear problems in least squares
- Efficient generalized conjugate gradient algorithms. I: Theory
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- BFGS trust-region method for symmetric nonlinear equations
- A modified PRP conjugate gradient method
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A class of derivative-free methods for large-scale nonlinear monotone equations
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Title not available
- Title not available
- A conjugate gradient method with descent direction for unconstrained optimization
- The convergence properties of some new conjugate gradient methods
- A new trust-region method with line search for solving symmetric nonlinear equations
- A spectral conjugate gradient method for unconstrained optimization
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Convergence Properties of Algorithms for Nonlinear Optimization
- Convergence Properties of the BFGS Algorithm
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Title not available
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A new adaptive trust region algorithm for optimization problems
- An adaptive trust region algorithm for large-residual nonsmooth least squares problems
- A class of parameter estimation methods for nonlinear Muskingum model using hybrid invasive weed optimization algorithm
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- A novel parameter estimation method for Muskingum model using new Newton-type trust region algorithm
- A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models
- An effective adaptive trust region algorithm for nonsmooth minimization
- The projection technique for two open problems of unconstrained optimization problems
Cited In (2)
Uses Software