A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
DOI: 10.1137/030601880
zbMATH Open: 1093.90085
OpenAlex: W2018215034
MaRDI QID: Q5317554
FDO: Q5317554
Authors: William Hager, Hongchao Zhang
Publication date: 16 September 2005
Published in: SIAM Journal on Optimization
Full work available at URL: https://semanticscholar.org/paper/45efe38ea906de376db50a86b8a11ef566821e42
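The paper's contribution is a conjugate gradient update whose search directions satisfy a sufficient descent condition independent of the line search. The sketch below illustrates the Hager–Zhang direction update with its lower truncation of the beta parameter; the Armijo backtracking line search and the quadratic test setup are illustrative stand-ins (the paper itself uses an efficient approximate Wolfe line search), so treat this as a minimal sketch, not the authors' implementation.

```python
import numpy as np

def hz_cg(f, grad, x0, tol=1e-8, max_iter=500, eta=0.01):
    # Sketch of the Hager-Zhang conjugate gradient direction update.
    # The line search here is plain backtracking Armijo for illustration;
    # the paper pairs the update with an approximate Wolfe line search.
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (illustrative stand-in).
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d.dot(y)
        # Hager-Zhang beta: (y - 2 d ||y||^2 / (d^T y))^T g_new / (d^T y)
        beta = (y - 2.0 * d * y.dot(y) / dy).dot(g_new) / dy
        # Lower truncation eta_k = -1 / (||d|| min(eta, ||g||)),
        # which the paper uses to guarantee global convergence.
        eta_k = -1.0 / (np.linalg.norm(d) * min(eta, np.linalg.norm(g)))
        beta = max(beta, eta_k)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

For example, minimizing the strictly convex quadratic f(x) = ½xᵀAx − bᵀx with A = diag(1, 10) and b = (1, 1) drives the iterates to the unique minimizer A⁻¹b = (1, 0.1).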
Recommendations
- A new type of descent conjugate gradient method with exact line search
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- A new nonlinear conjugate gradient method with guaranteed global convergence
- A new conjugate gradient method with the Wolfe line search
- A new conjugate gradient method for unconstrained optimization with sufficient descent
- An efficient conjugate gradient method with sufficient descent property
- A new conjugate gradient method with strongly global convergence and sufficient descent condition
Keywords: convergence; nonlinear programming; global convergence; unconstrained optimization; conjugate gradient method; line search; Wolfe conditions; CUTE
Cited In (showing first 100 items)
- A modified conjugate gradient method for general convex functions
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- Application of scaled nonlinear conjugate-gradient algorithms to the inverse natural convection problem
- On a conjugate directions method for solving strictly convex QP problem
- A conjugate gradient method based on a modified secant relation for unconstrained optimization
- A three-term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A sufficient descent Liu–Storey conjugate gradient method and its global convergence
- Combining and scaling descent and negative curvature directions
- A modified nonlinear conjugate gradient method with the Armijo line search and its application
- An Liu-Storey-type method for solving large-scale nonlinear monotone equations
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- The global proof of the Polak-Ribière-Polak algorithm under the YWL inexact line search technique
- Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints
- Application of optimal control to the cardiac defibrillation problem using a physiological model of cellular dynamics
- Higher-order triangular spectral element method with optimized cubature points for seismic wavefield modeling
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- A globally convergent hybrid conjugate gradient method with strong Wolfe conditions for unconstrained optimization
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- A minimum action method for small random perturbations of two-dimensional parallel shear flows
- Instabilities in shear and simple shear deformations of gold crystals
- A new efficient conjugate gradient method for unconstrained optimization
- Variational formulation for Wannier functions with entangled band structure
- Numerical simulations of some nonlinear conjugate gradient methods with inexact line searches
- Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems
- Line search fixed point algorithms based on nonlinear conjugate gradient directions: application to constrained smooth convex optimization
- Iterated dynamic thresholding search for packing equal circles into a circular container
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A nonmonotone hybrid conjugate gradient method for unconstrained optimization
- A simple sufficient descent method for unconstrained optimization
- A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization
- A conjugate gradient method with sufficient descent property
- A new accelerated conjugate gradient method for large-scale unconstrained optimization
- A note on the spectral gradient projection method for nonlinear monotone equations with applications
- On the sufficient descent property of the Shanno's conjugate gradient method
- Global convergence of a descent PRP type conjugate gradient method for nonconvex optimization
- Some sufficient descent conjugate gradient methods and their global convergence
- Nonconvex optimization using negative curvature within a modified linesearch
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
- Nonmonotone spectral gradient method based on memoryless symmetric rank-one update for large-scale unconstrained optimization
- A sufficient descent conjugate gradient method and its global convergence
- LMBOPT: a limited memory method for bound-constrained optimization
- Some three-term conjugate gradient methods with the new direction structure
- Convergence of the descent Dai-Yuan conjugate gradient method for unconstrained optimization
- Real-time pricing method for smart grid based on social welfare maximization model
- The new spectral conjugate gradient method for large-scale unconstrained optimisation
- A smoothing iterative method for the finite minimax problem
- A sufficient descent LS conjugate gradient method for unconstrained optimization problems
- New three-term conjugate gradient method with guaranteed global convergence
- A new Liu-Storey type nonlinear conjugate gradient method for unconstrained optimization problems
- An active set trust-region method for bound-constrained optimization
- A new class of conjugate gradient methods for unconstrained smooth optimization and absolute value equations
- A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
- Global convergence of the DY conjugate gradient method with Armijo line search for unconstrained optimization problems
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation
- CGRS -- an advanced hybrid method for global optimization of continuous functions closely coupling extended random search and conjugate gradient method
- A dynamic-solver-consistent minimum action method: with an application to 2D Navier-Stokes equations
- A modified spectral conjugate gradient method with global convergence
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- Quantum optimal control using the adjoint method
- A robust extremum seeking scheme for dynamic systems with uncertainties and disturbances
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- Globally convergent Polak-Ribière-Polyak conjugate gradient methods under a modified Wolfe line search
- Modeling and control through leadership of a refined flocking system
- Spectral method and its application to the conjugate gradient method
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- A conjugate gradient method with descent direction for unconstrained optimization
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- The convergence of conjugate gradient method with nonmonotone line search
- A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
- A sufficient descent Dai-Yuan type nonlinear conjugate gradient method for unconstrained optimization problems
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- Some modified conjugate gradient methods for unconstrained optimization
- New hybrid conjugate gradient method for unconstrained optimization
- Globally convergent modified Perry's conjugate gradient method
- Algorithm 851
- Applying Powell's symmetrical technique to conjugate gradient methods
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- Two descent hybrid conjugate gradient methods for optimization
- A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints
- Scaled conjugate gradient algorithms for unconstrained optimization
- Two modified scaled nonlinear conjugate gradient methods
- A modified Hestenes-Stiefel conjugate gradient algorithm for large-scale optimization
- A limited memory BFGS-type method for large-scale unconstrained optimization
- A conjugate gradient method for unconstrained optimization problems
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Hybrid conjugate gradient algorithm for unconstrained optimization
- A spectral Dai-Yuan-type conjugate gradient method for unconstrained optimization
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
Uses Software