Efficient generalized conjugate gradient algorithms. I: Theory
DOI: 10.1007/BF00940464 · zbMATH Open: 0702.90077 · MaRDI QID: Q915657
Authors: Elsie Sterbin Gottlieb
Publication date: 1991
Published in: Journal of Optimization Theory and Applications
Keywords: global convergence; unconstrained optimization; conjugacy; generalized conjugate gradient method; inexact line search
MSC: Nonlinear programming (90C30); Computational methods for problems pertaining to operations research and mathematical programming (90-08)
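The keywords above name the classical setting of this paper: a nonlinear conjugate gradient iteration driven by an inexact line search, analyzed for global convergence. As a hedged illustration only, the sketch below shows a standard Fletcher-Reeves iteration with Armijo backtracking; it is not Gottlieb's generalized algorithm, and the objective function and all identifiers are invented for the example.

```python
# Minimal sketch of a nonlinear conjugate gradient iteration with an
# inexact line search, the setting named by this record's keywords.
# This is the classical Fletcher-Reeves scheme with Armijo backtracking,
# NOT the generalized algorithm of Gottlieb's paper; the objective and
# every identifier below are illustrative assumptions.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def grad(f, x, h=1e-6):
    # Central-difference gradient; exact for quadratics up to round-off.
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def cg_minimize(f, x0, tol=1e-8, max_iter=200):
    x = list(x0)
    g = grad(f, x)
    d = [-gi for gi in g]                      # start with steepest descent
    for _ in range(max_iter):
        if dot(g, g) < tol:
            break
        if dot(g, d) >= 0:                     # safeguard: restart when d
            d = [-gi for gi in g]              # is not a descent direction
        # Armijo backtracking: an "inexact line search" in the keywords' sense.
        t, c = 1.0, 1e-4
        step = lambda t: [xi + t * di for xi, di in zip(x, d)]
        while t > 1e-12 and f(step(t)) > f(x) + c * t * dot(g, d):
            t *= 0.5
        x = step(t)
        g_new = grad(f, x)
        beta = dot(g_new, g_new) / dot(g, g)   # Fletcher-Reeves coefficient
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

# Usage: a simple convex quadratic with minimizer (1, 2).
f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] - 2) ** 2
xstar = cg_minimize(f, [0.0, 0.0])
```

The restart safeguard matters because Fletcher-Reeves directions combined with a merely inexact line search need not be descent directions, which is exactly the kind of issue the global-convergence analysis in papers of this line addresses.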
Cites Work
- Function minimization by conjugate gradients
- Title not available
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- The Conjugate Gradient Method for Linear and Nonlinear Operator Equations
- Efficient generalized conjugate gradient algorithms. II: Implementation
- Comparison of some conjugate direction procedures for function minimization
- Globally convergent conjugate gradient algorithms
Cited In (first 100 items shown)
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- A new hybrid algorithm for convex nonlinear unconstrained optimization
- GLOBAL CONVERGENCE OF SHORTEST-RESIDUAL FAMILY OF CONJUGATE GRADIENT METHODS WITHOUT LINE SEARCH
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- Three modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for unconstrained optimization
- The convergence of conjugate gradient method with nonmonotone line search
- A modified form of conjugate gradient method for unconstrained optimization problems
- A modified nonlinear conjugate gradient method with the Armijo line search and its application
- A sufficient descent Dai-Yuan type nonlinear conjugate gradient method for unconstrained optimization problems
- A modified spectral conjugate gradient method for solving unconstrained minimization problems
- Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization
- Global convergence of a two-parameter family of conjugate gradient methods without line search
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- Some modified conjugate gradient methods for unconstrained optimization
- New hybrid conjugate gradient method for unconstrained optimization
- Two spectral conjugate gradient methods for unconstrained optimization problems
- A new version of the Liu-Storey conjugate gradient method
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- A generalized conjugate gradient algorithm
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- Two descent hybrid conjugate gradient methods for optimization
- Some three-term conjugate gradient methods with the inexact line search condition
- A globally convergent hybrid conjugate gradient method with strong Wolfe conditions for unconstrained optimization
- Two modified Polak-Ribière-Polyak-type nonlinear conjugate methods with sufficient descent property
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- A descent nonlinear conjugate gradient method for large-scale unconstrained optimization
- Global convergence of two kinds of three-term conjugate gradient methods without line search
- A conjugate gradient method for unconstrained optimization problems
- A note about WYL's conjugate gradient method and its applications
- A new modified three-term conjugate gradient method with sufficient descent property and its global convergence
- A new family of conjugate gradient methods
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- Two new conjugate gradient methods based on modified secant equations
- An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
- Surrogate gradient algorithm for Lagrangian relaxation
- Convergence conditions, line search algorithms and trust region implementations for the Polak–Ribière conjugate gradient method
- New step lengths in conjugate gradient methods
- Riemannian conjugate gradient methods: general framework and specific algorithms with convergence analyses
- Penalty algorithm based on conjugate gradient method for solving portfolio management problem
- An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation
- A new two-parameter family of nonlinear conjugate gradient methods
- Convergence properties of a class of nonlinear conjugate gradient methods
- A modified hybrid conjugate gradient method for unconstrained optimization
- The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search
- Nonmonotone spectral gradient method based on memoryless symmetric rank-one update for large-scale unconstrained optimization
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- Some three-term conjugate gradient methods with the new direction structure
- An extension of the Fletcher-Reeves method to linear equality constrained optimization problem
- Convergence of the descent Dai-Yuan conjugate gradient method for unconstrained optimization
- A three-parameter family of nonlinear conjugate gradient methods
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- A family of hybrid conjugate gradient methods for unconstrained optimization
- Title not available
- Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization
- An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems
- Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence
- On three-term conjugate gradient algorithms for unconstrained optimization
- A new spectral conjugate gradient method for large-scale unconstrained optimization
- New three-term conjugate gradient method with guaranteed global convergence
- A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- Symmetric Perry conjugate gradient method
- Convergence of Liu-Storey conjugate gradient method
- A Two-Term PRP-Based Descent Method
- Présentation de synthèse des méthodes de gradient conjugué [Survey presentation of conjugate gradient methods]
- A new Liu-Storey type nonlinear conjugate gradient method for unconstrained optimization problems
- A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
- A modified CG-DESCENT method for unconstrained optimization
- A fast inertial self-adaptive projection based algorithm for solving large-scale nonlinear monotone equations
- New spectral LS conjugate gradient method for nonlinear unconstrained optimization
- Global convergence of a modified spectral FR conjugate gradient method
- A new class of nonlinear conjugate gradient coefficients with global convergence properties
- Dai-Kou type conjugate gradient methods with a line search only using gradient
- A conjugate gradient algorithm for large-scale unconstrained optimization problems and nonlinear equations
- An efficient hybrid conjugate gradient method for unconstrained optimization
- Efficient generalized conjugate gradient algorithms. II: Implementation
- Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search
- A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions
- An efficient conjugate gradient-based algorithm for unconstrained optimization and its projection extension to large-scale constrained nonlinear equations with applications in signal recovery and image denoising problems
- Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- New hybrid conjugate gradient and Broyden-Fletcher-Goldfarb-Shanno conjugate gradient methods
- The convergence properties of RMIL+ conjugate gradient method under the strong Wolfe line search
- A conjugate gradient method based on a modified secant relation for unconstrained optimization
- A new hybrid PRPFR conjugate gradient method for solving nonlinear monotone equations and image restoration problems
- A descent conjugate gradient algorithm for optimization problems and its applications in image restoration and compression sensing
- A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems
- A new hybrid conjugate gradient method of unconstrained optimization methods
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- A modified three-term type CD conjugate gradient algorithm for unconstrained optimization problems
- An accelerated conjugate gradient algorithm for solving nonlinear monotone equations and image restoration problems
- A global convergence of LS-CD hybrid conjugate gradient method
- A conjugate gradient algorithm and its applications in image restoration
- Linear convergence of descent methods for the unconstrained minimization of restricted strongly convex functions
- A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function