A conjugate gradient method with descent direction for unconstrained optimization
From MaRDI portal
Publication: 732160
DOI: 10.1016/j.cam.2009.08.001 · zbMath: 1179.65075 · OpenAlex: W1970131234 · MaRDI QID: Q732160
Zeng-xin Wei, Xi-wen Lu, Gong Lin Yuan
Publication date: 9 October 2009
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2009.08.001
Keywords: unconstrained optimization; global convergence; numerical results; conjugate gradient method; line search; Polak-Ribière-Polyak method; search direction; Wolfe-Powell rule; Zoutendijk condition
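For orientation, the Polak-Ribière-Polyak (PRP) keyword refers to the classical conjugate gradient update β_k = gₖ₊₁ᵀ(gₖ₊₁ − gₖ)/‖gₖ‖². The sketch below is a generic textbook PRP iteration, not the modified descent-direction method of this paper; for brevity it uses simple Armijo backtracking in place of the Wolfe-Powell rule mentioned in the keywords, and the PRP+ truncation max(β, 0) with a steepest-descent restart as a safeguard.

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-8, max_iter=200):
    """Classical PRP conjugate gradient sketch (illustrative only).

    NOT the modified method of this paper; Armijo backtracking is
    used here instead of the Wolfe-Powell line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                  # safeguard: restart if not a descent direction
            d = -g
        # Armijo backtracking line search
        alpha, rho, c = 1.0, 0.5, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP coefficient: beta = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = g_new @ (g_new - g) / (g @ g)
        d = -g_new + max(beta, 0.0) * d  # PRP+ truncation
        x, g = x_new, g_new
    return x

# Simple quadratic test problem with minimizer (1, 2)
f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] - 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] - 2)])
x_star = prp_cg(f, grad, np.zeros(2))
```

On this convex quadratic the iteration converges to the minimizer; the convergence theory for general nonconvex functions under inexact line searches is precisely what the paper and its related items study.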
Related Items
- The Hager–Zhang conjugate gradient algorithm for large-scale nonlinear equations
- A conjugate gradient algorithm for large-scale unconstrained optimization problems and nonlinear equations
- A new adaptive trust region algorithm for optimization problems
- Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems
- A new class of nonlinear conjugate gradient coefficients with global convergence properties
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- Optimization of unconstrained problems using a developed algorithm of spectral conjugate gradient method calculation
- Global convergence of a modified spectral conjugate gradient method
- The projection technique for two open problems of unconstrained optimization problems
- Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization
- Global convergence properties of the BBB conjugate gradient method
- An active set limited memory BFGS algorithm for bound constrained optimization
- The global convergence of a new mixed conjugate gradient method for unconstrained optimization
- A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
- A modified conjugate gradient algorithm with backtracking line search technique for large-scale nonlinear equations
- Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method
- A quasi-Newton algorithm for large-scale nonlinear equations
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- A BFGS algorithm for solving symmetric nonlinear equations
- A modified three-term PRP conjugate gradient algorithm for optimization models
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A modified three-term conjugate gradient method with sufficient descent property
- A New Method with Descent Property for Symmetric Nonlinear Equations
- Nonlinear conjugate gradient methods for continuous-time output feedback design
- Estimation of heat flux in two-dimensional nonhomogeneous parabolic equation based on a sufficient descent Levenberg-Marquardt algorithm
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A Trust Region Algorithm with Conjugate Gradient Technique for Optimization Problems
- A conjugate gradient algorithm and its applications in image restoration
- The global proof of the Polak-Ribière-Polak algorithm under the YWL inexact line search technique
- A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- Optimal Estimation of Free Energies and Stationary Densities from Multiple Biased Simulations
- A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization
- A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A new class of nonlinear conjugate gradient coefficients for unconstrained optimization
Uses Software
Cites Work
- A nonmonotone conjugate gradient algorithm for unconstrained optimization
- Convergence analysis of a modified BFGS method on convex minimizations
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- The convergence properties of some new conjugate gradient methods
- The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- A modified PRP conjugate gradient method
- Efficient hybrid conjugate gradient techniques
- Global convergence result for conjugate gradient methods
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Convergence of line search methods for unconstrained optimization
- New line search methods for unconstrained optimization
- A three-parameter family of nonlinear conjugate gradient methods
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles.
- An efficient hybrid conjugate gradient method for unconstrained optimization