A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
DOI: 10.1137/030601880 · zbMATH Open: 1093.90085 · OpenAlex: W2018215034 · MaRDI QID: Q5317554 · FDO: Q5317554
Authors: William Hager, Hongchao Zhang
Publication date: 16 September 2005
Published in: SIAM Journal on Optimization
Full work available at URL: https://semanticscholar.org/paper/45efe38ea906de376db50a86b8a11ef566821e42
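For context, the method this page indexes (Hager and Zhang's CG_DESCENT) generates search directions d_{k+1} = -g_{k+1} + β_k d_k, where β_k is chosen so that d_{k+1} satisfies the sufficient-descent bound d_{k+1}ᵀ g_{k+1} ≤ -(7/8)‖g_{k+1}‖² regardless of the line search. A minimal NumPy sketch of that update follows; the function and variable names are my own, not from the paper, and this omits the paper's β truncation and line-search details:

```python
import numpy as np

def beta_hz(g_next, d, y):
    """Hager-Zhang conjugate gradient parameter.

    g_next : gradient at the new iterate, g_{k+1}
    d      : previous search direction, d_k
    y      : gradient difference, g_{k+1} - g_k
    Assumes d @ y != 0.
    """
    dy = d @ y
    # beta = (y - 2 d ||y||^2 / (d^T y))^T g_{k+1} / (d^T y)
    return (y - 2.0 * d * (y @ y) / dy) @ g_next / dy

def next_direction(g_next, d, y):
    # d_{k+1} = -g_{k+1} + beta * d_k; satisfies the guaranteed-descent
    # bound d_{k+1}^T g_{k+1} <= -(7/8) ||g_{k+1}||^2 for any d, y with d^T y != 0
    return -g_next + beta_hz(g_next, d, y) * d
```

The descent bound is a purely algebraic property of this β formula, which is what distinguishes the method from earlier conjugate gradient variants whose descent guarantees depend on the line search.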
Recommendations
- A new type of descent conjugate gradient method with exact line search
- scientific article; zbMATH DE number 1092181
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- scientific article; zbMATH DE number 1022806
- A new nonlinear conjugate gradient method with guaranteed global convergence
- scientific article
- A new conjugate gradient method with the Wolfe line search
- A new conjugate gradient method for unconstrained optimization with sufficient descent
- An efficient conjugate gradient method with sufficient descent property
- A new conjugate gradient method with strongly global convergence and sufficient descent condition
Keywords: convergence; nonlinear programming; global convergence; unconstrained optimization; conjugate gradient method; line search; Wolfe conditions; CUTE
Cited In (first 100 items)
- A modified conjugate gradient method based on the self-scaling memoryless BFGS update
- Solving optimal control problem of monodomain model using hybrid conjugate gradient methods
- A note on robust descent in differentiable optimization
- Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search
- Conditional gradient method for vector optimization
- Sufficient descent Riemannian conjugate gradient methods
- An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
- New hybrid conjugate gradient and Broyden-Fletcher-Goldfarb-Shanno conjugate gradient methods
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- A family of quasi-Newton methods for unconstrained optimization problems
- An efficient conjugate gradient trust-region approach for systems of nonlinear equations
- An adaptive nonmonotone global Barzilai–Borwein gradient method for unconstrained optimization
- A new derivative-free SCG-type projection method for nonlinear monotone equations with convex constraints
- Computing the \(p\)-spectral radii of uniform hypergraphs with applications
- A Fokker-Planck approach to control collective motion
- Linear convergence of descent methods for the unconstrained minimization of restricted strongly convex functions
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- Globally convergent three-term conjugate gradient projection methods for solving nonlinear monotone equations
- Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
- Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization
- A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues
- A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
- Flow search approach and new bounds for the \(m\)-step linear conjugate gradient algorithm
- Conjugate gradient methods using value of objective function for unconstrained optimization
- An improved three-term conjugate gradient algorithm for solving unconstrained optimization problems
- A modified Wei-Yao-Liu conjugate gradient method for unconstrained optimization
- Speeding up the convergence of the Polyak's heavy ball algorithm
- On the efficiency of gradient based optimization algorithms for DNS-based optimal control in a turbulent channel flow
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- Derivative-free method for bound constrained nonlinear monotone equations and its application in solving steady state reaction-diffusion problems
- An improved Perry conjugate gradient method with adaptive parameter choice
- A Barzilai and Borwein scaling conjugate gradient method for unconstrained optimization problems
- Preconditioned nonlinear conjugate gradient methods based on a modified secant equation
- A derivative-free iterative method for nonlinear monotone equations with convex constraints
- Second-order approximation and fast multigrid solution of parabolic bilinear optimization problems
- A class of one parameter conjugate gradient methods
- A truncated descent HS conjugate gradient method and its global convergence
- Riemannian Multigrid Line Search for Low-Rank Problems
- Using improved directions of negative curvature for the solution of bound-constrained nonconvex problems
- A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- An accelerated conjugate gradient algorithm with guaranteed descent and conjugacy conditions for unconstrained optimization
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- Two nonmonotone trust region algorithms based on an improved Newton method
- A hybrid of DL and WYL nonlinear conjugate gradient methods
- A Barzilai-Borwein gradient projection method for sparse signal and blurred image restoration
- A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization
- Further comment on another hybrid conjugate gradient algorithm for unconstrained optimization by Andrei
- On the optimal control of a random walk with jumps and barriers
- An efficient three-term conjugate gradient method for nonlinear monotone equations with convex constraints
- A joint optimization of momentum item and Levenberg-Marquardt algorithm to level up the BPNN's generalization ability
- Two modified three-term type conjugate gradient methods and their global convergence for unconstrained optimization
- Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination
- A Fokker–Planck Feedback Control-Constrained Approach for Modelling Crowd Motion
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
- A class of accelerated subspace minimization conjugate gradient methods
- A modified Hestenes–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- A Perry-type derivative-free algorithm for solving nonlinear system of equations and minimizing ℓ1 regularized problem
- A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
- A descent conjugate gradient algorithm for optimization problems and its applications in image restoration and compression sensing
- A new hybrid conjugate gradient method of unconstrained optimization methods
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- The Hager–Zhang conjugate gradient algorithm for large-scale nonlinear equations
- A conjugate gradient algorithm and its applications in image restoration
- A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
- A modified Polak–Ribière–Polyak descent method for unconstrained optimization
- A derivative-free three-term Hestenes–Stiefel type method for constrained nonlinear equations and image restoration
- A scaled conjugate gradient method with moving asymptotes for unconstrained optimization problems
- Two improved nonlinear conjugate gradient methods with the strong Wolfe line search
- Limited memory BFGS algorithm for the matrix approximation problem in Frobenius norm
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- An efficient modified AZPRP conjugate gradient method for large-scale unconstrained optimization problem
- A new descent spectral Polak-Ribière-Polyak method based on the memoryless BFGS update
- A modified nonlinear Polak-Ribière-Polyak conjugate gradient method with sufficient descent property
- A modified sufficient descent Polak-Ribière-Polyak type conjugate gradient method for unconstrained optimization problems
- New hybrid conjugate gradient method as a convex combination of LS and FR methods
- A spectral three-term Hestenes-Stiefel conjugate gradient method
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
- Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization
- Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
- A new family of conjugate gradient methods for unconstrained optimization
- A sufficient descent three-term conjugate gradient method via symmetric rank-one update for large-scale optimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- Unconstrained direct optimization of spacecraft trajectories using many embedded Lambert problems
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- Behavior of the combination of PRP and HZ methods for unconstrained optimization
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- Descent three-term DY-type conjugate gradient methods for constrained monotone equations with application
- Solving unconstrained optimization problems via hybrid CD-DY conjugate gradient methods with applications
- Least-squares-based three-term conjugate gradient methods
- On large-scale unconstrained optimization and arbitrary regularization
- A hybrid FR-DY conjugate gradient algorithm for unconstrained optimization with application in portfolio selection
- Nonlinearly preconditioned optimization on Grassmann manifolds for computing approximate Tucker tensor decompositions
- A scaled three-term conjugate gradient method for unconstrained optimization
Uses Software