A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
From MaRDI portal
Publication:5300528
DOI: 10.1137/100813026
zbMATH Open: 1266.49065
OpenAlex: W2075313995
MaRDI QID: Q5300528
Authors: CaiXia Kou, Yuhong Dai
Publication date: 27 June 2013
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/100813026
Recommendations
- Two improved nonlinear conjugate gradient methods with the strong Wolfe line search
- Nonlinear conjugate gradient methods with Wolfe type line search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A new globally convergent conjugate gradient method with Wolfe line search
- Global convergence of a conjugate gradient method with strong Wolfe-Powell line search
Keywords: global convergence; unconstrained optimization; conjugate gradient method; Wolfe line search; memoryless BFGS method
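As context for the keywords above: the paper concerns a nonlinear conjugate gradient method paired with an improved Wolfe line search. The sketch below is a generic illustration only, not the paper's algorithm or its improved line search; it uses a standard bisection search for the (weak) Wolfe conditions and a PRP+ update, with all parameter values (`c1`, `c2`, the test problem) chosen for the demo.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_bisect=60):
    """Bisection search for a step length satisfying the weak Wolfe conditions."""
    lo, hi, t = 0.0, np.inf, 1.0
    fx = f(x)
    dg0 = grad(x) @ d          # directional derivative at t = 0 (must be < 0)
    for _ in range(max_bisect):
        if f(x + t * d) > fx + c1 * t * dg0:      # sufficient decrease fails
            hi = t
        elif grad(x + t * d) @ d < c2 * dg0:      # curvature condition fails
            lo = t
        else:
            return t
        t = 0.5 * (lo + hi) if hi < np.inf else 2.0 * lo
    return t

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=200):
    """Nonlinear CG with a Wolfe line search and the PRP+ update (illustrative)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = wolfe_line_search(f, grad, x, d)
        x = x + t * d
        g_new = grad(x)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)  # PRP+ restart rule
        d = -g_new + beta * d
        g = g_new
    return x

# Demo on a strictly convex quadratic f(x) = 0.5 x'Ax - b'x with minimizer A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = nonlinear_cg(f, grad, np.zeros(2))
```

On this quadratic the iterates converge to the linear-system solution `A⁻¹b`; the memoryless-BFGS viewpoint named in the keywords concerns how the search direction can be interpreted, which this generic sketch does not attempt to reproduce.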
Cited In (first 100 items shown)
- A modified conjugate gradient method for general convex functions
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems
- An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- Three modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme
- Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization
- A modified conjugate gradient method based on a modified secant equation
- Two adaptive Dai-Liao nonlinear conjugate gradient methods
- Total variation superiorized conjugate gradient method for image reconstruction
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- A modified scaling parameter for the memoryless BFGS updating formula
- A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- A new efficient conjugate gradient method for unconstrained optimization
- A descent family of Dai-Liao conjugate gradient methods
- Spectral conjugate gradient methods for vector optimization problems
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- On the convergence of \(s\)-dependent GFR conjugate gradient method for unconstrained optimization
- An improved Perry conjugate gradient method with adaptive parameter choice
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A conjugate gradient method with sufficient descent property
- An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation
- A nonmonotone hybrid method of conjugate gradient and Lanczos-type for solving nonlinear systems
- Two optimal Dai–Liao conjugate gradient methods
- A descent Dai-Liao conjugate gradient method for nonlinear equations
- A class of one parameter conjugate gradient methods
- A new conjugate gradient algorithm with sufficient descent property for unconstrained optimization
- A modified hybrid conjugate gradient method for unconstrained optimization
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
- An improved nonlinear conjugate gradient method with an optimal property
- A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach
- A sufficient descent conjugate gradient method and its global convergence
- A Barzilai-Borwein conjugate gradient method
- A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
- On the sufficient descent condition of the Hager-Zhang conjugate gradient methods
- A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- Barzilai-Borwein-like methods for the extreme eigenvalue problem
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- Norm descent conjugate gradient methods for solving symmetric nonlinear equations
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
- A norm descent derivative-free algorithm for solving large-scale nonlinear symmetric equations
- Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence
- A new three-term conjugate gradient algorithm for unconstrained optimization
- A new spectral conjugate gradient method for large-scale unconstrained optimization
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- An approximate gradient-type method for nonlinear symmetric equations with convex constraints
- Sufficient descent conjugate gradient methods for solving convex constrained nonlinear monotone equations
- A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
- On optimality of two adaptive choices for the parameter of Dai-Liao method
- A New Dai-Liao Conjugate Gradient Method with Optimal Parameter Choice
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization
- Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination
- Dai-Kou type conjugate gradient methods with a line search only using gradient
- A modified spectral conjugate gradient method with global convergence
- An efficient adaptive three-term extension of the Hestenes–Stiefel conjugate gradient method
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
- Two classes of spectral conjugate gradient methods for unconstrained optimizations
- Two modified conjugate gradient methods for unconstrained optimization with applications in image restoration problems
- Theoretical characteristics and numerical methods for a class of special piecewise quadratic optimization
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- A class of accelerated subspace minimization conjugate gradient methods
- A modified Dai-Liao conjugate gradient method for solving unconstrained optimization and image restoration problems
- A modified Hestense–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations
- A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
- A novel value for the parameter in the Dai-Liao-type conjugate gradient method
- Delayed Gradient Methods for Symmetric and Positive Definite Linear Systems
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- A global convergence of LS-CD hybrid conjugate gradient method
- The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems
- An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- A scaled conjugate gradient method with moving asymptotes for unconstrained optimization problems
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- A modified nonlinear Polak-Ribière-Polyak conjugate gradient method with sufficient descent property
- A spectral three-term Hestenes-Stiefel conjugate gradient method