A class of accelerated conjugate-gradient-like methods based on a modified secant equation
From MaRDI portal
Publication: Q2190281
Recommendations
- A new class of spectral conjugate gradient methods based on a modified secant equation for unconstrained optimization
- A modified conjugate gradient method based on a modified secant equation
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- A modified Perry conjugate gradient method and its global convergence
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
Cites work
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- A modified BFGS method and its global convergence in nonconvex minimization
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations
- A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods
- A survey of nonlinear conjugate gradient methods
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- An improved Perry conjugate gradient method with adaptive parameter choice
- Benchmarking optimization software with performance profiles
- CUTE
- Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search
- Globally convergent modified Perry's conjugate gradient method
- Incorporating nonmonotone strategies into the trust region method for unconstrained optimization
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- New quasi-Newton equation and related methods for unconstrained optimization
- Numerical Optimization
- On the nonmonotone line search
- Optimization theory and methods. Nonlinear programming
- Scaled conjugate gradient algorithms for unconstrained optimization
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
Cited in (2 documents)