A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
Publication: Q5300528
DOI: 10.1137/100813026
zbMATH Open: 1266.49065
OpenAlex: W2075313995
MaRDI QID: Q5300528
Authors: Cai-Xia Kou, Yu-Hong Dai
Publication date: 27 June 2013
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/100813026
Recommendations
- Two improved nonlinear conjugate gradient methods with the strong Wolfe line search
- Nonlinear conjugate gradient methods with Wolfe type line search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A new globally convergent conjugate gradient method with Wolfe line search
- Global convergence of a conjugate gradient method with strong Wolfe-Powell line search
Keywords: global convergence; unconstrained optimization; conjugate gradient method; Wolfe line search; memoryless BFGS method
Cited in (first 100 items shown):
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
- Two classes of spectral conjugate gradient methods for unconstrained optimizations
- Two modified conjugate gradient methods for unconstrained optimization with applications in image restoration problems
- Theoretical characteristics and numerical methods for a class of special piecewise quadratic optimization
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- A class of accelerated subspace minimization conjugate gradient methods
- A modified Dai-Liao conjugate gradient method for solving unconstrained optimization and image restoration problems
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations
- A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
- A novel value for the parameter in the Dai-Liao-type conjugate gradient method
- A modified Hestenes-Stiefel conjugate gradient method with an optimal property
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- A global convergence of LS-CD hybrid conjugate gradient method
- Matrix analyses on the Dai-Liao conjugate gradient method
- The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems
- An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- A scaled conjugate gradient method with moving asymptotes for unconstrained optimization problems
- Two improved nonlinear conjugate gradient methods with the strong Wolfe line search
- A descent Dai-Liao projection method for convex constrained nonlinear monotone equations with applications
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- A modified nonlinear Polak-Ribière-Polyak conjugate gradient method with sufficient descent property
- An efficient adaptive three-term extension of the Hestenes-Stiefel conjugate gradient method
- A spectral three-term Hestenes-Stiefel conjugate gradient method
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- A new CG algorithm based on a scaled memoryless BFGS update with adaptive search strategy, and its application to large-scale unconstrained optimization problems
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
- Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
- A modified Hestense-Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
- A new family of conjugate gradient methods for unconstrained optimization
- A class of globally convergent three-term Dai-Liao conjugate gradient methods
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- A modified Dai-Kou-type method with applications to signal reconstruction and blurred image restoration
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- Two descent Dai-Yuan conjugate gradient methods for systems of monotone nonlinear equations
- A modified nonmonotone Hestenes-Stiefel type conjugate gradient method for large-scale unconstrained problems
- A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems
- A three-term conjugate gradient algorithm with restart procedure to solve image restoration problems
- A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems
- A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization
- Further comment on another hybrid conjugate gradient algorithm for unconstrained optimization by Andrei
- New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method
- A survey of gradient methods for solving nonlinear optimization
- A fast conjugate gradient algorithm with active set prediction for ℓ1 optimization
- Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
- A three-term conjugate gradient method with accelerated subspace quadratic optimization
- A family of hybrid conjugate gradient method with restart procedure for unconstrained optimizations and image restorations
- A new conjugate gradient method with an efficient memory structure
- A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model
- A new self-scaling memoryless quasi-Newton update for unconstrained optimization
- An efficient inertial subspace minimization CG algorithm with convergence rate analysis for constrained nonlinear monotone equations
- An accelerated relaxed-inertial strategy based CGP algorithm with restart technique for constrained nonlinear pseudo-monotone equations to image de-blurring problems
- A mini-batch stochastic conjugate gradient algorithm with variance reduction
- A simulated annealing-based Barzilai-Borwein gradient method for unconstrained optimization problems
- A new subspace minimization conjugate gradient method for unconstrained minimization
- Another Hager-Zhang-type method via singular-value study for constrained monotone equations with application
- A family of accelerated hybrid conjugate gradient method for unconstrained optimization and image restoration
- A conjugate gradient algorithm without Lipchitz continuity and its applications
- On two symmetric Dai-Kou type schemes for constrained monotone equations with image recovery application
- Nonlinear conjugate gradient for smooth convex functions
- Two families of self-adjusting spectral hybrid DL conjugate gradient methods and applications in image denoising
- A New Dai-Liao Conjugate Gradient Method based on Approximately Optimal Stepsize for Unconstrained Optimization
- A family of spectral conjugate gradient methods with strong convergence and its applications in image restoration and machine learning
- An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems
- An inertial spectral CG projection method based on the memoryless BFGS update
- A family of three-term conjugate gradient projection methods with a restart procedure and their relaxed-inertial extensions for the constrained nonlinear pseudo-monotone equations with applications
- A new approximate descent derivative-free algorithm for large-scale nonlinear symmetric equations
- An efficient modified residual-based algorithm for large scale symmetric nonlinear equations by approximating successive iterated gradients
- New gradient methods with adaptive stepsizes by approximate models
- An improved PRP type spectral conjugate gradient method with restart steps
- A new structured spectral conjugate gradient method for nonlinear least squares problems
- A globally convergent hybrid conjugate gradient method and its numerical behaviors
- An overview of nonlinear optimization
- Two families of hybrid conjugate gradient methods with restart procedures and their applications
- An extended version of the memoryless DFP algorithm with the sufficient descent property
- Delayed gradient methods for symmetric and positive definite linear systems
- A modulus-based nonmonotone line search method for nonlinear complementarity problems
- A truncated three-term conjugate gradient method with complexity guarantees with applications to nonconvex regression problem
- A modified four-term extension of the Dai-Liao conjugate gradient method
- On the extension of Dai-Liao conjugate gradient method for vector optimization
- A family of conjugate gradient methods with guaranteed positiveness and descent for vector optimization
- Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization
- A new conjugate gradient algorithm with cubic Barzilai-Borwein stepsize for unconstrained optimization
- A new version of augmented self-scaling BFGS method
- Globally linearly convergent nonlinear conjugate gradients without Wolfe line search
- Global convergence of three-term conjugate gradient methods on general functions under a new inexact line search strategy
- Alternative extension of the Hager–Zhang conjugate gradient method for vector optimization
- Accelerated Dai-Liao projection method for solving systems of monotone nonlinear equations with application to image deblurring
- A new three-term conjugate gradient method with descent direction for unconstrained optimization
- A family of limited memory three term conjugate gradient methods
- Two efficient spectral hybrid CG methods based on memoryless BFGS direction and Dai–Liao conjugacy condition
- Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization