A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
From MaRDI portal
Publication: Q668235
DOI: 10.1016/j.amc.2015.07.019 · zbMath: 1410.90203 · MaRDI QID: Q668235
Mohd Rivaie, Mustafa Mamat, Abdelrhaman Abashar
Publication date: 18 March 2019
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2015.07.019
90C30: Nonlinear programming
65K10: Numerical optimization and variational techniques
49M37: Numerical methods based on nonlinear programming
90C52: Methods of reduced gradient type
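The record classifies the work under nonlinear programming and numerical optimization: the paper proposes new conjugate gradient coefficients and analyzes them under exact and inexact line searches. As context, a minimal sketch of a generic nonlinear conjugate gradient iteration is shown below, using the classical Fletcher-Reeves coefficient and an Armijo backtracking (inexact) line search; this is an illustrative stand-in, not the new coefficient class proposed in the paper.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear conjugate gradient method.

    Uses the classical Fletcher-Reeves coefficient
        beta_k = ||g_{k+1}||^2 / ||g_k||^2
    and an Armijo backtracking (inexact) line search.
    This is a textbook sketch, not the coefficient class from the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:
            # Safeguard: if d is not a descent direction, restart
            d = -g
        # Armijo backtracking (inexact) line search
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the strictly convex quadratic f(x) = x^T A x / 2 - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = nonlinear_cg(f, grad, np.zeros(2))
```

The convergence analyses cited below (e.g. the Fletcher-Reeves and Wolfe line-search entries under "Cites Work") concern exactly this iteration pattern: how the choice of beta and the line-search conditions interact to guarantee descent and global convergence.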
Related Items
- A modified form of conjugate gradient method for unconstrained optimization problems
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- Some three-term conjugate gradient methods with the inexact line search condition
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
Cites Work
- A new class of nonlinear conjugate gradient coefficients with global convergence properties
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- The convergence of conjugate gradient method with nonmonotone line search
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
- A conjugate gradient method with descent direction for unconstrained optimization
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- Efficient generalized conjugate gradient algorithms. I: Theory
- Efficient hybrid conjugate gradient techniques
- Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
- New line search methods for unconstrained optimization
- A new family of conjugate gradient methods
- Minimization of functions having Lipschitz continuous first partial derivatives
- Convergence Properties of Algorithms for Nonlinear Optimization
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Convergence Conditions for Ascent Methods
- On Steepest Descent
- Methods of conjugate gradients for solving linear systems
- A spectral conjugate gradient method for unconstrained optimization
- Benchmarking optimization software with performance profiles
- Global convergence of conjugate gradient methods without line search