A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
DOI: 10.1137/030601880 · zbMATH Open: 1093.90085 · OpenAlex: W2018215034 · MaRDI QID: Q5317554 · FDO: Q5317554
Authors: William Hager, Hongchao Zhang
Publication date: 16 September 2005
Published in: SIAM Journal on Optimization
Full work available at URL: https://semanticscholar.org/paper/45efe38ea906de376db50a86b8a11ef566821e42
Recommendations
- A new type of descent conjugate gradient method with exact line search
- scientific article; zbMATH DE number 1092181
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- scientific article; zbMATH DE number 1022806
- A new nonlinear conjugate gradient method with guaranteed global convergence
- scientific article; zbMATH DE number 179266
- A new conjugate gradient method with the Wolfe line search
- A new conjugate gradient method for unconstrained optimization with sufficient descent
- An efficient conjugate gradient method with sufficient descent property
- A new conjugate gradient method with strongly global convergence and sufficient descent condition
Keywords: convergence · nonlinear programming · global convergence · unconstrained optimization · conjugate gradient method · line search · Wolfe conditions · CUTE
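For context on why so many of the citing works below build on the "guaranteed descent" property: the Hager-Zhang update computes its conjugacy parameter so that the search direction satisfies \(g_k^T d_k \le -\tfrac{7}{8}\|g_k\|^2\) regardless of the line search. The following is a minimal numerical sketch of that update in NumPy, using the paper's formula for \(\beta^{HZ}\) and its lower-bound truncation; the diagonal quadratic test problem and all tolerances are illustrative choices, not from the paper.

```python
import numpy as np

def hz_beta(g_new, g_old, d, eta=0.01):
    """Hager-Zhang conjugacy parameter with the paper's lower-bound truncation."""
    y = g_new - g_old
    dy = d @ y
    beta = (y - 2.0 * d * (y @ y) / dy) @ g_new / dy
    # truncation: beta_bar = max(beta, eta_k), eta_k = -1 / (||d_k|| min(eta, ||g_k||))
    eta_k = -1.0 / (np.linalg.norm(d) * min(eta, np.linalg.norm(g_old)))
    return max(beta, eta_k)

# illustrative convex quadratic f(x) = 0.5 x^T A x, gradient g = A x
A = np.diag([1.0, 10.0, 100.0])
x = np.array([1.0, 1.0, 1.0])
g = A @ x
d = -g
for _ in range(5):
    alpha = -(g @ d) / (d @ (A @ d))   # exact line search on a quadratic
    x = x + alpha * d
    g_new = A @ x
    if np.linalg.norm(g_new) < 1e-12:
        break
    beta = hz_beta(g_new, g, d)
    d_new = -g_new + beta * d
    # guaranteed-descent property: g^T d <= -(7/8) ||g||^2
    assert g_new @ d_new <= -0.875 * (g_new @ g_new) + 1e-12
    g, d = g_new, d_new
```

On this 3-dimensional quadratic with exact line searches the iteration terminates in at most three steps, and the descent inequality holds at every iterate, which is the property the method guarantees even under the paper's inexact (approximate Wolfe) line search.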
Cited in (showing first 100 items):
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
- A descent conjugate gradient algorithm for optimization problems and its applications in image restoration and compression sensing
- A new hybrid conjugate gradient method of unconstrained optimization methods
- A modified Hestenes-Stiefel conjugate gradient method with an optimal property
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- A conjugate gradient algorithm and its applications in image restoration
- A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
- Conjugate gradient methods using value of objective function for unconstrained optimization
- A derivative-free three-term Hestenes–Stiefel type method for constrained nonlinear equations and image restoration
- A scaled conjugate gradient method with moving asymptotes for unconstrained optimization problems
- Two improved nonlinear conjugate gradient methods with the strong Wolfe line search
- Limited memory BFGS algorithm for the matrix approximation problem in Frobenius norm
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- An efficient modified AZPRP conjugate gradient method for large-scale unconstrained optimization problem
- A new descent spectral Polak-Ribière-Polyak method based on the memoryless BFGS update
- A modified nonlinear Polak-Ribière-Polyak conjugate gradient method with sufficient descent property
- A modified sufficient descent Polak-Ribière-Polyak type conjugate gradient method for unconstrained optimization problems
- New hybrid conjugate gradient method as a convex combination of LS and FR methods
- A spectral three-term Hestenes-Stiefel conjugate gradient method
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
- Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization
- Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
- A modified Hestenes-Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A Perry-type derivative-free algorithm for solving nonlinear system of equations and minimizing \(\ell_1\) regularized problem
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
- A modified Polak-Ribière-Polyak descent method for unconstrained optimization
- A new family of conjugate gradient methods for unconstrained optimization
- A sufficient descent three-term conjugate gradient method via symmetric rank-one update for large-scale optimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- Unconstrained direct optimization of spacecraft trajectories using many embedded Lambert problems
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- The Hager-Zhang conjugate gradient algorithm for large-scale nonlinear equations
- Behavior of the combination of PRP and HZ methods for unconstrained optimization
- A new adaptive conjugate gradient algorithm for large-scale unconstrained optimization
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- Descent three-term DY-type conjugate gradient methods for constrained monotone equations with application
- Solving unconstrained optimization problems via hybrid CD-DY conjugate gradient methods with applications
- Least-squares-based three-term conjugate gradient methods
- On large-scale unconstrained optimization and arbitrary regularization
- A hybrid FR-DY conjugate gradient algorithm for unconstrained optimization with application in portfolio selection
- Nonlinearly preconditioned optimization on Grassmann manifolds for computing approximate Tucker tensor decompositions
- A scaled three-term conjugate gradient method for unconstrained optimization
- The effects of control domain position on optimal control of cardiac arrhythmia
- Automatic human gait imitation and recognition in 3D from monocular video with an uncalibrated camera
- Global convergence via descent modified three-term conjugate gradient projection algorithm with applications to signal recovery
- A dynamic extremum seeking scheme for three-player attack-defense with unknown gradient
- PRP-like algorithm for monotone operator equations
- The projection technique for two open problems of unconstrained optimization problems
- A survey of gradient methods for solving nonlinear optimization
- Nonlinear Conjugate Gradient Methods for Vector Optimization
- Global convergence of a modified spectral three-term CG algorithm for nonconvex unconstrained optimization problems
- A three-term conjugate gradient method with accelerated subspace quadratic optimization
- Global convergence of Hager-Zhang type Riemannian conjugate gradient method
- A family of hybrid conjugate gradient method with restart procedure for unconstrained optimizations and image restorations
- A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems
- A fully non-linear optimization approach to acousto-electric tomography
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- Quantum optimal control using the adjoint method
- A robust extremum seeking scheme for dynamic systems with uncertainties and disturbances
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- Globally convergent Polak-Ribière-Polyak conjugate gradient methods under a modified Wolfe line search
- Modeling and control through leadership of a refined flocking system
- Spectral method and its application to the conjugate gradient method
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- A conjugate gradient method with descent direction for unconstrained optimization
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- The convergence of conjugate gradient method with nonmonotone line search
- A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
- A sufficient descent Dai-Yuan type nonlinear conjugate gradient method for unconstrained optimization problems
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- Some modified conjugate gradient methods for unconstrained optimization
- New hybrid conjugate gradient method for unconstrained optimization
- Globally convergent modified Perry's conjugate gradient method
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Applying Powell's symmetrical technique to conjugate gradient methods
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- Two descent hybrid conjugate gradient methods for optimization
- A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints
- Scaled conjugate gradient algorithms for unconstrained optimization
- Two modified scaled nonlinear conjugate gradient methods
- A modified Hestenes-Stiefel conjugate gradient algorithm for large-scale optimization
- A limited memory BFGS-type method for large-scale unconstrained optimization
- A conjugate gradient method for unconstrained optimization problems
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Hybrid conjugate gradient algorithm for unconstrained optimization
- A spectral Dai-Yuan-type conjugate gradient method for unconstrained optimization