A modified scaled conjugate gradient method with global convergence for nonconvex functions
zbMATH Open: 1305.90379 · MaRDI QID: Q464217 · FDO: Q464217
Authors: Saman Babaie-Kafaki, Reza Ghanbari
Publication date: 17 October 2014
Published in: Bulletin of the Belgian Mathematical Society - Simon Stevin
Full work available at URL: https://projecteuclid.org/euclid.bbms/1407765884
Recommendations
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- Global convergence of a modified limited memory BFGS method for non-convex minimization
- A modified scaling BFGS method for nonconvex minimization
- A globally convergent BFGS method with nonmonotone line search for non-convex minimization
- Two modified scaled nonlinear conjugate gradient methods
Keywords: descent condition; global convergence; unconstrained optimization; conjugate gradient algorithm; secant equation
MSC classifications: Numerical mathematical programming methods (65K05); Eigenvalues, singular values, and eigenvectors (15A18); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37); Methods of quasi-Newton type (90C53)
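For readers unfamiliar with the keywords above, the following is a minimal sketch of a *generic* scaled conjugate gradient iteration for unconstrained minimization. It is not the specific modification proposed in this paper: the Oren-Luenberger-type scaling \(\theta_k = s_k^T y_k / y_k^T y_k\) (derived from the secant equation), the PRP-type \(\beta_k\), and the Armijo backtracking line search are all illustrative assumptions.

```python
# Generic scaled CG sketch (illustrative only; NOT the method of this paper).
import numpy as np

def scaled_cg(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking line search (sufficient-decrease condition)
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Scaling from the secant equation: theta = s'y / y'y
        # (may be nonpositive on nonconvex functions; see safeguard below)
        theta = s.dot(y) / max(y.dot(y), 1e-16)
        # Polak-Ribiere-Polyak-type beta, truncated at zero (PRP+)
        beta = max(g_new.dot(y) / max(g.dot(g), 1e-16), 0.0)
        d = -theta * g_new + beta * d
        # Safeguard: restart with steepest descent if d fails the descent condition
        if g_new.dot(d) > -1e-12 * np.linalg.norm(g_new) * np.linalg.norm(d):
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage example on the nonconvex two-dimensional Rosenbrock function:
if __name__ == "__main__":
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array(
        [-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
         200 * (x[1] - x[0]**2)])
    print(scaled_cg(f, grad, np.array([-1.2, 1.0])))
```

The descent-condition safeguard illustrates why such restarts matter for nonconvex objectives: when curvature information \(s_k^T y_k\) is nonpositive, the scaled direction need not be a descent direction, which is exactly the setting the paper's global convergence analysis addresses.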
Cited In (18)
- An extended Dai-Liao conjugate gradient method with global convergence for nonconvex functions
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- A modified scaling parameter for the memoryless BFGS updating formula
- An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems
- A restart scheme for the memoryless BFGS method
- An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method
- Modified Newton integration algorithm with noise suppression for online dynamic nonlinear optimization
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem
- A globally convergent Newton method for convex \(SC^1\) minimization problems
- A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems
- A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems
- An adaptive conjugate gradient algorithm for large-scale unconstrained optimization
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- A New Adaptive Conjugate Gradient Algorithm for Large-Scale Unconstrained Optimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization