A modified conjugate gradient algorithm with backtracking line search technique for large-scale nonlinear equations
Publication: 4641557
DOI: 10.1080/00207160.2017.1290433
zbMath: 1390.65043
OpenAlex: W2584008117
MaRDI QID: Q4641557
Xiabin Duan, Xiangrong Li, Zhou Sheng, Xiaoliang Wang
Publication date: 17 May 2018
Published in: International Journal of Computer Mathematics
Full work available at URL: https://doi.org/10.1080/00207160.2017.1290433
Numerical mathematical programming methods (65K05)
Nonconvex programming, global optimization (90C26)
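The title's method pairs conjugate gradient search directions with a backtracking (Armijo-type) line search for solving large-scale nonlinear equations F(x) = 0. As a generic illustration only — a standard PRP+ conjugate gradient applied to the merit function f(x) = ½‖F(x)‖², not the authors' specific modification — such a scheme might be sketched as follows (the names `cg_backtracking`, `F`, `J` and all parameter values are illustrative assumptions):

```python
import numpy as np

def cg_backtracking(F, J, x0, tol=1e-8, max_iter=500):
    """Illustrative PRP+ conjugate gradient with Armijo backtracking on the
    merit function f(x) = 0.5 * ||F(x)||^2 (not the paper's exact method)."""
    x = np.asarray(x0, dtype=float)
    g = J(x).T @ F(x)                       # gradient of the merit function
    d = -g                                  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(F(x)) < tol:
            break
        # Armijo backtracking: shrink the step until sufficient decrease holds
        alpha, rho, c = 1.0, 0.5, 1e-4
        f = 0.5 * F(x) @ F(x)
        while 0.5 * F(x + alpha * d) @ F(x + alpha * d) > f + c * alpha * (g @ d):
            alpha *= rho
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = J(x_new).T @ F(x_new)
        beta = g_new @ (g_new - g) / (g @ g)  # Polak-Ribiere-Polyak parameter
        d = -g_new + max(beta, 0.0) * d       # PRP+ truncation
        if g_new @ d >= 0.0:                  # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

For example, applied to the decoupled system F(x) = (x₀² − 2, x₁² − 3) with Jacobian diag(2x₀, 2x₁), the iteration converges to (√2, √3) from a nearby starting point.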
Related Items
- A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems
- A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems
- A class of three-term derivative-free methods for large-scale nonlinear monotone system of equations and applications to image restoration problems
Cites Work
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- Another conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization
- A BFGS trust-region method for nonlinear equations
- Limited memory BFGS method with backtracking for symmetric nonlinear equations
- A conjugate gradient method with descent direction for unconstrained optimization
- The convergence properties of some new conjugate gradient methods
- Efficient generalized conjugate gradient algorithms. I: Theory
- A note about WYL's conjugate gradient method and its applications
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- BFGS trust-region method for symmetric nonlinear equations
- A modified PRP conjugate gradient method
- A new backtracking inexact BFGS method for symmetric nonlinear equations
- Convergence of line search methods for unconstrained optimization
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- Analysis of two Chebyshev-like third order methods free from second derivatives for solving systems of nonlinear equations
- A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization
- An active-set projected trust-region algorithm with limited memory BFGS technique for box-constrained nonsmooth equations
- A New Method with Descent Property for Symmetric Nonlinear Equations
- A Trust Region Algorithm with Conjugate Gradient Technique for Optimization Problems
- A new trust-region method with line search for solving symmetric nonlinear equations
- Algorithm 851
- Solution of the Chandrasekhar H-equation by Newton’s Method
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Convergence Theory of Nonlinear Newton–Krylov Algorithms
- Newton-type Methods with Generalized Distances For Constrained Optimization
- A Globally and Superlinearly Convergent Gauss–Newton-Based BFGS Method for Symmetric Nonlinear Equations
- Descent Directions of Quasi-Newton Methods for Symmetric Nonlinear Equations
- A BFGS algorithm for solving symmetric nonlinear equations
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles.