Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
From MaRDI portal
Publication: Q1036299
DOI: 10.1155/2009/243290
zbMath: 1184.65066
OpenAlex: W2076396137
Wikidata: Q58648924 (Scholia: Q58648924)
MaRDI QID: Q1036299
Yun-hai Xiao, Jian Guo Zhang, Zeng-xin Wei
Publication date: 13 November 2009
Published in: Mathematical Problems in Engineering
Full work available at URL: https://eudml.org/doc/45812
Keywords: unconstrained optimization; global convergence; comparison of methods; numerical results; large-scale; PRP method; nonlinear conjugate gradient-type methods
Numerical mathematical programming methods (65K05) Large-scale problems in mathematical programming (90C06) Nonlinear programming (90C30)
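The title's sufficient descent condition requires every search direction \(d_k\) to satisfy \(g_k^T d_k \le -c\,\|g_k\|^2\) for some \(c > 0\). As a rough illustration only (not the authors' exact scheme), the following hypothetical Python sketch shows a PRP-type nonlinear conjugate gradient iteration with a descent safeguard and an Armijo backtracking line search; the function names `prp_cg`, `f`, and `grad` are assumptions for this example.

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000, c=1e-4, rho=0.5):
    """Illustrative PRP-type conjugate gradient with a sufficient-descent safeguard."""
    x = x0.astype(float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Safeguard: if d is not a sufficient descent direction, restart with -g
        if g @ d > -1e-10 * (g @ g):
            d = -g
        # Armijo backtracking line search
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + c * t * (g @ d):
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP+ parameter: nonnegative truncation of the Polak-Ribiere-Polyak beta
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

For a strictly convex quadratic such as \(f(x) = \tfrac12\|x\|^2\), the iteration reduces to an exact steepest-descent step and converges immediately; on general smooth functions the truncation `max(beta, 0.0)` is the standard device used to restore global convergence of the PRP method.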
Related Items
- A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem
- DTSMA: dominant swarm with adaptive T-distribution mutation-based slime mould algorithm
- Large sparse signal recovery by conjugate gradient algorithm based on smoothing technique
- A simple sufficient descent method for unconstrained optimization
- A new modified three-term conjugate gradient method with sufficient descent property and its global convergence
- Global convergence of a nonlinear conjugate gradient method
- A modified conjugate gradient algorithm with cyclic Barzilai-Borwein steplength for unconstrained optimization
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
- Sufficient descent Polak-Ribière-Polyak conjugate gradient algorithm for large-scale box-constrained optimization
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- A nonmonotone hybrid method of conjugate gradient and Lanczos-type for solving nonlinear systems
- A new class of conjugate gradient methods for unconstrained smooth optimization and absolute value equations
- On three-term conjugate gradient algorithms for unconstrained optimization
- A new three-term conjugate gradient algorithm for unconstrained optimization
- Spectral three-term constrained conjugate gradient algorithm for function minimizations
- Descent Symmetrization of the Dai–Liao Conjugate Gradient Method
- Some three-term conjugate gradient methods with the new direction structure
- Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations
Uses Software
Cites Work
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- The global and superlinear convergence of a new nonmonotone MBFGS algorithm on convex objective functions
- A limited memory BFGS-type method for large-scale unconstrained optimization
- A conjugate direction algorithm without line searches
- New quasi-Newton equation and related methods for unconstrained optimization
- A globally convergent version of the Polak-Ribière conjugate gradient method
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Scaled conjugate gradient algorithms for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- Optimization theory and methods. Nonlinear programming
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Another nonlinear conjugate gradient algorithm for unconstrained optimization
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- CUTE
- Convergence properties of the Fletcher-Reeves method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Convergence conditions, line search algorithms and trust region implementations for the Polak–Ribière conjugate gradient method
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A spectral conjugate gradient method for unconstrained optimization
- Benchmarking optimization software with performance profiles.
- Global convergence of conjugate gradient methods without line search