A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems
DOI: 10.1016/j.apnum.2023.05.024
zbMath: 1528.65032
OpenAlex: W4378977857
MaRDI QID: Q6064895
Publication date: 10 November 2023
Published in: Applied Numerical Mathematics
Full work available at URL: https://doi.org/10.1016/j.apnum.2023.05.024
Keywords: conjugacy condition; image restoration; sufficient descent property; acceleration strategy; spectral three-term conjugate gradient method
MSC: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53); Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
Cites Work
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- A modified scaled conjugate gradient method with global convergence for nonconvex functions
- New quasi-Newton methods via higher order tensor models
- Convergence analysis of a modified BFGS method on convex minimizations
- A modified quasi-Newton method for structured optimization with partial information on the Hessian
- A descent spectral conjugate gradient method for impulse noise removal
- Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems
- A limited memory BFGS method for solving large-scale symmetric nonlinear equations
- A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues
- On the quadratic convergence of the Levenberg-Marquardt method without nonsingularity assumption
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- The projection technique for two open problems of unconstrained optimization problems
- Convergence of a relaxed inertial proximal algorithm for maximally monotone operators
- A class of globally convergent three-term Dai-Liao conjugate gradient methods
- A nonsmooth version of Newton's method
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A modified conjugate gradient algorithm with backtracking line search technique for large-scale nonlinear equations
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Some descent three-term conjugate gradient methods and their global convergence
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles