A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
DOI: 10.1137/030601880 · zbMATH Open: 1093.90085 · OpenAlex: W2018215034 · MaRDI QID: Q5317554
Authors: William Hager, Hongchao Zhang
Publication date: 16 September 2005
Published in: SIAM Journal on Optimization
Full work available at URL: https://semanticscholar.org/paper/45efe38ea906de376db50a86b8a11ef566821e42
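The method this record describes can be illustrated with a minimal sketch of the Hager-Zhang direction update on a toy quadratic, using an exact line search for simplicity. This is an assumption-laden simplification, not the authors' CG_DESCENT code: the paper pairs the update with an approximate Wolfe line search and truncates the beta parameter for global convergence, both of which are omitted here, and the problem data below are hypothetical example values.

```python
# Minimal sketch of the Hager-Zhang conjugate gradient direction update
# on a toy 2-D quadratic f(x) = 0.5 x^T A x - b^T x (hypothetical data).
# Simplifications vs. the paper: exact line search instead of the
# approximate Wolfe line search, and no lower-bound truncation of beta.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

A = [[4.0, 1.0], [1.0, 3.0]]   # symmetric positive definite (example data)
b = [1.0, 2.0]

def grad(x):                   # gradient of f: A x - b
    return [dot(row, x) - bi for row, bi in zip(A, b)]

x = [0.0, 0.0]
g = grad(x)
d = [-gi for gi in g]          # initial direction: steepest descent

for _ in range(50):
    if dot(g, g) < 1e-20:
        break
    Ad = [dot(row, d) for row in A]
    alpha = -dot(g, d) / dot(d, Ad)   # exact minimizer along d (quadratic only)
    x = [xi + alpha * di for xi, di in zip(x, d)]
    g_new = grad(x)
    y = [gn - gi for gn, gi in zip(g_new, g)]
    dy = dot(d, y)
    # Hager-Zhang beta: ((y - 2 d ||y||^2 / (d.y)) . g_new) / (d.y)
    w = [yi - 2.0 * di * dot(y, y) / dy for yi, di in zip(y, d)]
    beta = dot(w, g_new) / dy
    d = [-gn + beta * di for gn, di in zip(g_new, d)]
    g = g_new
    # "Guaranteed descent": g.d <= -(7/8) ||g||^2, independent of
    # line search accuracy
    assert dot(g, d) <= -0.875 * dot(g, g) + 1e-12

print(x)  # converges to the solution of A x = b, i.e. about (1/11, 7/11)
```

On this quadratic the iteration reduces to linear conjugate gradients and converges in two steps; the descent inequality checked in the loop is the property the title refers to.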
Recommendations
- A new type of descent conjugate gradient method with exact line search
- scientific article; zbMATH DE number 1092181
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- scientific article; zbMATH DE number 1022806
- A new nonlinear conjugate gradient method with guaranteed global convergence
- scientific article
- A new conjugate gradient method with the Wolfe line search
- A new conjugate gradient method for unconstrained optimization with sufficient descent
- An efficient conjugate gradient method with sufficient descent property
- A new conjugate gradient method with strongly global convergence and sufficient descent condition
Keywords: convergence; nonlinear programming; global convergence; unconstrained optimization; conjugate gradient method; line search; Wolfe conditions; CUTE
Cited In
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- Quantum optimal control using the adjoint method
- A robust extremum seeking scheme for dynamic systems with uncertainties and disturbances
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- Globally convergent Polak-Ribière-Polyak conjugate gradient methods under a modified Wolfe line search
- Modeling and control through leadership of a refined flocking system
- Spectral method and its application to the conjugate gradient method
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- Riemannian Conjugate Gradient Methods: General Framework and Specific Algorithms with Convergence Analyses
- A conjugate gradient method with descent direction for unconstrained optimization
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- The convergence of conjugate gradient method with nonmonotone line search
- A sufficient descent Dai-Yuan type nonlinear conjugate gradient method for unconstrained optimization problems
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- Some modified conjugate gradient methods for unconstrained optimization
- New hybrid conjugate gradient method for unconstrained optimization
- Globally convergent modified Perry's conjugate gradient method
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Applying Powell's symmetrical technique to conjugate gradient methods
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- Two descent hybrid conjugate gradient methods for optimization
- A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints
- Scaled conjugate gradient algorithms for unconstrained optimization
- Two modified scaled nonlinear conjugate gradient methods
- A limited memory BFGS-type method for large-scale unconstrained optimization
- A conjugate gradient method for unconstrained optimization problems
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Hybrid conjugate gradient algorithm for unconstrained optimization
- A spectral Dai-Yuan-type conjugate gradient method for unconstrained optimization
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- An efficient multigrid strategy for large-scale molecular mechanics optimization
- A second-order gradient method for convex minimization
- A globally and R-linearly convergent hybrid HS and PRP method and its inexact version with applications
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- A nonmonotone hybrid method of conjugate gradient and Lanczos-type for solving nonlinear systems
- A practical PR+ conjugate gradient method only using gradient
- A three-term derivative-free projection method for nonlinear monotone system of equations
- Convergence properties of a class of nonlinear conjugate gradient methods
- A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization
- A Barzilai-Borwein conjugate gradient method
- A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
- A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- Barzilai-Borwein-like methods for the extreme eigenvalue problem
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization
- A descent extension of the Polak-Ribière-Polyak conjugate gradient method
- Modified nonlinear conjugate gradient method with sufficient descent condition for unconstrained optimization
- A conjugate gradient method to solve convex constrained monotone equations with applications in compressive sensing
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
- A new three-term conjugate gradient algorithm for unconstrained optimization
- On three-term conjugate gradient algorithms for unconstrained optimization
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
- A smoothing conjugate gradient method for solving systems of nonsmooth equations
- An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems
- Inverse determination of a heat source from natural convection in a porous cavity
- A self-adjusting spectral conjugate gradient method for large-scale unconstrained optimization
- Discrete second order adjoints in atmospheric chemical transport modeling
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- A modified conjugate gradient algorithm with cyclic Barzilai-Borwein steplength for unconstrained optimization
- Symmetric Perry conjugate gradient method
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- A Two-Term PRP-Based Descent Method
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Some descent three-term conjugate gradient methods and their global convergence
- Sufficient descent conjugate gradient methods for solving convex constrained nonlinear monotone equations
- A COKOSNUT code for the control of the time-dependent Kohn-Sham model
- Conjugate gradient type methods for the nondifferentiable convex minimization
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- A derivative-free conjugate gradient method and its global convergence for solving symmetric nonlinear equations
- A Regularization Approach for an Inverse Source Problem in Elliptic Systems from Single Cauchy Data
- n-step quadratic convergence of the MPRP method with a restart strategy
- Proximal methods for nonlinear programming: Double regularization and inexact subproblems
- A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations
- Some global convergence properties of the Wei-Yao-Liu conjugate gradient method with inexact line search
- The Variational Gaussian Approximation Revisited
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- Global convergence of a nonlinear conjugate gradient method
- A trust region algorithm with conjugate gradient technique for optimization problems
- Two modified DY conjugate gradient methods for unconstrained optimization problems
- Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme
- A descent family of Dai–Liao conjugate gradient methods
- Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- A modified conjugate gradient method based on a modified secant equation
- Two adaptive Dai-Liao nonlinear conjugate gradient methods
This page was built for publication: A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search