A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
From MaRDI portal
Publication: Q5300528
DOI: 10.1137/100813026
zbMATH Open: 1266.49065
OpenAlex: W2075313995
MaRDI QID: Q5300528
Authors: Cai-Xia Kou, Yu-Hong Dai
Publication date: 27 June 2013
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/100813026
Recommendations
- Two improved nonlinear conjugate gradient methods with the strong Wolfe line search
- Nonlinear conjugate gradient methods with Wolfe type line search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A new globally convergent conjugate gradient method with Wolfe line search
- Global convergence of a conjugate gradient method with strong Wolfe-Powell line search
Keywords: global convergence; unconstrained optimization; conjugate gradient method; Wolfe line search; memoryless BFGS method
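To illustrate the topic of the publication — a nonlinear conjugate gradient method driven by a Wolfe-type line search — here is a generic textbook sketch. It uses an HS+ direction update and a standard weak-Wolfe bisection search; it is *not* the Dai–Kou algorithm or its improved Wolfe line search, and all function names are hypothetical.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=60):
    """Bisection scheme for a step satisfying the weak Wolfe conditions.

    A textbook routine, not the improved Wolfe line search of the paper.
    """
    lo, hi, t = 0.0, np.inf, 1.0
    f0 = f(x)
    g0d = grad(x) @ d                          # directional derivative at x
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * g0d:   # sufficient decrease fails: shrink
            hi = t
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < c2 * g0d:   # curvature condition fails: grow
            lo = t
            t = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return t                           # both Wolfe conditions hold
    return t

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Nonlinear CG with the HS+ beta (an illustrative choice, not Dai-Kou)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = wolfe_line_search(f, grad, x, d)
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        beta = max((g_new @ y) / (d @ y), 0.0)  # HS+ truncation at zero
        d = -g_new + beta * d
        if g_new @ d >= 0:                      # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b (here x* = [0.2, 0.4]).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = nonlinear_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                      lambda x: A @ x - b,
                      np.zeros(2))
```

The HS+ truncation and the descent-restart safeguard are common textbook devices; the cited paper instead derives its parameter from a memoryless BFGS perspective and modifies the Wolfe conditions themselves.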
Cited in (showing first 100 items):
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
- Two classes of spectral conjugate gradient methods for unconstrained optimizations
- Two modified conjugate gradient methods for unconstrained optimization with applications in image restoration problems
- Theoretical characteristics and numerical methods for a class of special piecewise quadratic optimization
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- A class of accelerated subspace minimization conjugate gradient methods
- A modified Dai-Liao conjugate gradient method for solving unconstrained optimization and image restoration problems
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations
- A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
- A novel value for the parameter in the Dai-Liao-type conjugate gradient method
- A modified Hestenes-Stiefel conjugate gradient method with an optimal property
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- A global convergence of LS-CD hybrid conjugate gradient method
- Matrix analyses on the Dai-Liao conjugate gradient method
- The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems
- An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- A scaled conjugate gradient method with moving asymptotes for unconstrained optimization problems
- Two improved nonlinear conjugate gradient methods with the strong Wolfe line search
- A descent Dai-Liao projection method for convex constrained nonlinear monotone equations with applications
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- A modified nonlinear Polak-Ribière-Polyak conjugate gradient method with sufficient descent property
- An efficient adaptive three-term extension of the Hestenes-Stiefel conjugate gradient method
- A spectral three-term Hestenes-Stiefel conjugate gradient method
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- A new CG algorithm based on a scaled memoryless BFGS update with adaptive search strategy, and its application to large-scale unconstrained optimization problems
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
- Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
- A modified Hestenes-Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
- A new family of conjugate gradient methods for unconstrained optimization
- A class of globally convergent three-term Dai-Liao conjugate gradient methods
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- A modified Dai-Kou-type method with applications to signal reconstruction and blurred image restoration
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- Two descent Dai-Yuan conjugate gradient methods for systems of monotone nonlinear equations
- A modified nonmonotone Hestenes-Stiefel type conjugate gradient method for large-scale unconstrained problems
- A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems
- A three-term conjugate gradient algorithm with restart procedure to solve image restoration problems
- A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems
- A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization
- Further comment on another hybrid conjugate gradient algorithm for unconstrained optimization by Andrei
- New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method
- A survey of gradient methods for solving nonlinear optimization
- A fast conjugate gradient algorithm with active set prediction for ℓ1 optimization
- Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
- A three-term conjugate gradient method with accelerated subspace quadratic optimization
- A family of hybrid conjugate gradient method with restart procedure for unconstrained optimizations and image restorations
- A modified conjugate gradient method for general convex functions
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems
- An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- Three modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme
- A modified spectral conjugate gradient method for solving unconstrained minimization problems
- Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization
- A modified conjugate gradient method based on a modified secant equation
- Two adaptive Dai-Liao nonlinear conjugate gradient methods
- Two optimal Dai-Liao conjugate gradient methods
- Total variation superiorized conjugate gradient method for image reconstruction
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization
- A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- A modified scaling parameter for the memoryless BFGS updating formula
- A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
- Some new three-term Hestenes-Stiefel conjugate gradient methods with affine combination
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- A new efficient conjugate gradient method for unconstrained optimization
- A descent family of Dai-Liao conjugate gradient methods
- A new class of efficient and globally convergent conjugate gradient methods in the Dai-Liao family
- Spectral conjugate gradient methods for vector optimization problems
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- On the convergence of \(s\)-dependent GFR conjugate gradient method for unconstrained optimization
- An improved Perry conjugate gradient method with adaptive parameter choice
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A conjugate gradient method with sufficient descent property
- An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation
- A nonmonotone hybrid method of conjugate gradient and Lanczos-type for solving nonlinear systems
- A descent Dai-Liao conjugate gradient method for nonlinear equations
- A class of one parameter conjugate gradient methods
- A new conjugate gradient algorithm with sufficient descent property for unconstrained optimization
- A modified hybrid conjugate gradient method for unconstrained optimization
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
- An improved nonlinear conjugate gradient method with an optimal property
- A sufficient descent conjugate gradient method and its global convergence