A class of accelerated conjugate-gradient-like methods based on a modified secant equation
DOI: 10.3934/JIMO.2019013 · zbMATH Open: 1449.90260 · OpenAlex: W2920996151 · Wikidata: Q128151853 · Scholia: Q128151853 · MaRDI QID: Q2190281 · FDO: Q2190281
Authors: Haichan Lin, Yigui Ou
Publication date: 18 June 2020
Published in: Journal of Industrial and Management Optimization
Full work available at URL: https://doi.org/10.3934/jimo.2019013
Recommendations
- A new class of spectral conjugate gradient methods based on a modified secant equation for unconstrained optimization
- A modified conjugate gradient method based on a modified secant equation
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- A modified Perry conjugate gradient method and its global convergence
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
Keywords: unconstrained optimization; convergence analysis; modified secant equation; accelerated scheme; self-scaling memoryless BFGS update
MSC: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30)
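Background sketch (an illustration of the keyword "modified secant equation", not taken from the indexed paper itself): quasi-Newton methods require the updated Hessian approximation \(B_{k+1}\) to satisfy the secant equation \(B_{k+1} s_k = y_k\), where \(s_k = x_{k+1} - x_k\) and \(y_k = g_{k+1} - g_k\). The cited work "New quasi-Newton equation and related methods for unconstrained optimization" proposes a modified secant equation that incorporates function values as well as gradients:

\[
B_{k+1} s_k = \bar{y}_k, \qquad
\bar{y}_k = y_k + \frac{\vartheta_k}{s_k^{\top} u_k}\, u_k, \qquad
\vartheta_k = 6\,(f_k - f_{k+1}) + 3\,(g_k + g_{k+1})^{\top} s_k,
\]

where \(u_k\) is any vector with \(s_k^{\top} u_k \neq 0\) (a common choice is \(u_k = s_k\)). Whether the paper indexed here adopts exactly this form cannot be determined from this record alone.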
Cites Work
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- CUTE
- Numerical Optimization
- Benchmarking optimization software with performance profiles
- Optimization theory and methods. Nonlinear programming
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- Scaled conjugate gradient algorithms for unconstrained optimization
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A Nonmonotone Line Search Technique for Newton’s Method
- A survey of nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Globally convergent modified Perry's conjugate gradient method
- Incorporating nonmonotone strategies into the trust region method for unconstrained optimization
- On the nonmonotone line search
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
- Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search
- A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- An improved Perry conjugate gradient method with adaptive parameter choice
- A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods
Cited In (2)
Uses Software