A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
DOI: 10.1137/030601880 · zbMATH Open: 1093.90085 · OpenAlex: W2018215034 · MaRDI QID: Q5317554
Authors: William Hager, Hongchao Zhang
Publication date: 16 September 2005
Published in: SIAM Journal on Optimization
Full work available at URL: https://semanticscholar.org/paper/45efe38ea906de376db50a86b8a11ef566821e42
Recommendations
- A new type of descent conjugate gradient method with exact line search (scientific article; zbMATH DE number 1092181)
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties (scientific article; zbMATH DE number 1022806)
- A new nonlinear conjugate gradient method with guaranteed global convergence (scientific article; zbMATH DE number 179266)
- A new conjugate gradient method with the Wolfe line search
- A new conjugate gradient method for unconstrained optimization with sufficient descent
- An efficient conjugate gradient method with sufficient descent property
- A new conjugate gradient method with strongly global convergence and sufficient descent condition
Keywords: convergence; nonlinear programming; global convergence; unconstrained optimization; conjugate gradient method; line search; Wolfe conditions; CUTE
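For context on the record's subject: the paper's search-direction update and its guaranteed-descent property can be sketched as below. This is an illustrative sketch, not the authors' reference implementation (their software is CG_DESCENT); the function name `hz_direction` is ours. The descent bound \(d_k^T g_k \le -\tfrac{7}{8}\|g_k\|^2\) holds for this update independent of the line search.

```python
# Illustrative sketch of the Hager-Zhang direction update
#   d_{k+1} = -g_{k+1} + beta_N * d_k,
#   beta_N  = (y_k - 2*d_k*||y_k||^2 / (d_k^T y_k))^T g_{k+1} / (d_k^T y_k),
# where y_k = g_{k+1} - g_k. Function name is ours, not from the paper's code.
import numpy as np

def hz_direction(g_new, g_old, d):
    """One Hager-Zhang search-direction update (illustrative)."""
    y = g_new - g_old                      # gradient change y_k
    dy = d @ y                             # curvature term d_k^T y_k
    beta = (y - 2.0 * d * (y @ y) / dy) @ g_new / dy
    return -g_new + beta * d

# The descent guarantee d^T g <= -(7/8)*||g||^2 holds for any gradients and
# any prior direction with d^T y != 0, independent of the line search.
g_old = np.array([1.0, 2.0])
g_new = np.array([0.5, -1.0])
d = -g_old                                 # initial direction d_0 = -g_0
d_new = hz_direction(g_new, g_old, d)
assert d_new @ g_new <= -(7.0 / 8.0) * (g_new @ g_new) + 1e-12
```

The point of the update is that, unlike the classical Polak-Ribière-Polyak or Hestenes-Stiefel formulas, descent does not depend on the accuracy of the line search.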
Cited In (first 100 items shown)
- A derivative-free conjugate gradient method and its global convergence for solving symmetric nonlinear equations
- A Regularization Approach for an Inverse Source Problem in Elliptic Systems from Single Cauchy Data
- \(n\)-step quadratic convergence of the MPRP method with a restart strategy
- Proximal methods for nonlinear programming: Double regularization and inexact subproblems
- A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations
- Some global convergence properties of the Wei-Yao-Liu conjugate gradient method with inexact line search
- The Variational Gaussian Approximation Revisited
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- Global convergence of a nonlinear conjugate gradient method
- A trust region algorithm with conjugate gradient technique for optimization problems
- Two modified DY conjugate gradient methods for unconstrained optimization problems
- Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme
- Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- A modified conjugate gradient method based on a modified secant equation
- Two adaptive Dai-Liao nonlinear conjugate gradient methods
- Two optimal Dai-Liao conjugate gradient methods
- An adaptive high-order minimum action method
- Using approximate secant equations in limited memory methods for multilevel unconstrained optimization
- Two modified Dai-Yuan nonlinear conjugate gradient methods
- A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach
- A new nonlinear conjugate gradient method for unconstrained optimization
- Two modified Polak-Ribière-Polyak-type nonlinear conjugate methods with sufficient descent property
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- A modified Perry conjugate gradient method and its global convergence
- On efficiently combining limited-memory and trust-region techniques
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A projection method for convex constrained monotone nonlinear equations with applications
- A descent family of Dai-Liao conjugate gradient methods
- A new family of conjugate gradient methods
- Modified Dai-Yuan conjugate gradient method with sufficient descent property for nonlinear equations
- An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation
- Sufficient descent conjugate gradient methods for large-scale optimization problems
- Improving directions of negative curvature in an efficient manner
- Design of optimal PID controller with \(\epsilon\)-Routh stability for different processes
- An improved nonlinear conjugate gradient method with an optimal property
- On the sufficient descent condition of the Hager-Zhang conjugate gradient methods
- Self-adaptive inexact proximal point methods
- An adaptive conjugate gradient algorithm for large-scale unconstrained optimization
- Norm descent conjugate gradient methods for solving symmetric nonlinear equations
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence
- A new spectral conjugate gradient method for large-scale unconstrained optimization
- A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization
- A modified conjugate gradient algorithm with cyclic Barzilai-Borwein steplength for unconstrained optimization
- A family of derivative-free conjugate gradient methods for large-scale nonlinear systems of equations
- Extension of modified Polak-Ribière-Polyak conjugate gradient method to linear equality constraints minimization problems
- A practical relative error criterion for augmented Lagrangians
- Another conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization
- Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search
- A limited memory descent Perry conjugate gradient method
- On optimality of two adaptive choices for the parameter of Dai-Liao method
- Supermemory gradient methods for monotone nonlinear equations with convex constraints
- Sufficient descent nonlinear conjugate gradient methods with conjugacy condition
- A modified CG-DESCENT method for unconstrained optimization
- A fast inertial self-adaptive projection based algorithm for solving large-scale nonlinear monotone equations
- Convergence and stability of line search methods for unconstrained optimization
- A new class of nonlinear conjugate gradient coefficients with global convergence properties
- The convergence rate of a restart MFR conjugate gradient method with inexact line search
- A New Dai-Liao Conjugate Gradient Method with Optimal Parameter Choice
- A globally convergent derivative-free method for solving large-scale nonlinear monotone equations
- New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- A nonmonotone supermemory gradient algorithm for unconstrained optimization
- Dai-Kou type conjugate gradient methods with a line search only using gradient
- Novel preconditioners based on quasi-Newton updates for nonlinear conjugate gradient methods
- A modified three-term PRP conjugate gradient algorithm for optimization models
- A modified conjugate gradient method for general convex functions
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- Application of scaled nonlinear conjugate-gradient algorithms to the inverse natural convection problem
- On a conjugate directions method for solving strictly convex QP problem
- A conjugate gradient method based on a modified secant relation for unconstrained optimization
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A sufficient descent Liu–Storey conjugate gradient method and its global convergence
- Combining and scaling descent and negative curvature directions
- A modified nonlinear conjugate gradient method with the Armijo line search and its application
- An Liu-Storey-type method for solving large-scale nonlinear monotone equations
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- The global proof of the Polak-Ribière-Polak algorithm under the YWL inexact line search technique
- Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints
- Application of optimal control to the cardiac defibrillation problem using a physiological model of cellular dynamics
- Higher-order triangular spectral element method with optimized cubature points for seismic wavefield modeling
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- A globally convergent hybrid conjugate gradient method with strong Wolfe conditions for unconstrained optimization
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- A minimum action method for small random perturbations of two-dimensional parallel shear flows
- Instabilities in shear and simple shear deformations of gold crystals
- A new efficient conjugate gradient method for unconstrained optimization
- Variational formulation for Wannier functions with entangled band structure
- Numerical simulations of some nonlinear conjugate gradient methods with inexact line searches
- Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems
- Line search fixed point algorithms based on nonlinear conjugate gradient directions: application to constrained smooth convex optimization
- Iterated dynamic thresholding search for packing equal circles into a circular container
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A nonmonotone hybrid conjugate gradient method for unconstrained optimization
- A simple sufficient descent method for unconstrained optimization
- A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization