A new class of nonlinear conjugate gradient coefficients with global convergence properties
From MaRDI portal
Publication: 387529
DOI: 10.1016/j.amc.2012.05.030
zbMath: 1278.65094
OpenAlex: W2058309945
MaRDI QID: Q387529
Leong Wah June, Mohd Rivaie, Mustafa Mamat, Ismail Bin Mohd
Publication date: 23 December 2013
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2012.05.030
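The record above gives only bibliographic metadata. As a rough illustration of the kind of method the title refers to, the sketch below implements a generic nonlinear conjugate gradient iteration in Python with the coefficient beta_k = g_k^T (g_k - g_{k-1}) / ||d_{k-1}||^2, the form commonly attributed to this paper and referred to as RMIL in the follow-up items listed below. The backtracking Armijo line search, the restart safeguard, and the clipping of beta_k at zero are illustrative simplifications, not the exact or strong Wolfe line searches analysed in the paper; the function name rmil_cg and its parameters are hypothetical names chosen for this sketch.

```python
import numpy as np

def rmil_cg(f, grad, x0, tol=1e-6, max_iter=1000, c1=1e-4):
    """Nonlinear CG sketch with an RMIL-type coefficient (illustrative only)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                        # safeguard: restart if d is not a descent direction
            d = -g
        # Simple backtracking (Armijo) line search; the paper's analysis
        # instead uses exact / strong Wolfe line searches.
        t, slope = 1.0, g @ d
        for _ in range(60):
            if f(x + t * d) <= f(x) + c1 * t * slope:
                break
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # RMIL-type coefficient g_k^T (g_k - g_{k-1}) / ||d_{k-1}||^2,
        # clipped at zero in the spirit of the RMIL+ variant listed below.
        beta = max((g_new @ (g_new - g)) / (d @ d), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example use on the Rosenbrock function
if __name__ == "__main__":
    f = lambda v: (1 - v[0])**2 + 100 * (v[1] - v[0]**2)**2
    grad = lambda v: np.array([-2 * (1 - v[0]) - 400 * v[0] * (v[1] - v[0]**2),
                               200 * (v[1] - v[0]**2)])
    print(rmil_cg(f, grad, np.array([-1.2, 1.0])))
```

The snippet is only meant to show where the coefficient enters the standard conjugate gradient update d_k = -g_k + beta_k d_{k-1}; the global convergence guarantees discussed in the paper and in the comment paper on RMIL+ depend on the choice of line search and on how beta_k is restricted.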
Related Items (24)
- An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search
- A recalling-enhanced recurrent neural network: conjugate gradient learning algorithm and its convergence analysis
- A fast inertial self-adaptive projection based algorithm for solving large-scale nonlinear monotone equations
- Generalized RMIL conjugate gradient method under the strong Wolfe line search with application in image processing
- Hybrid random batch idea and nonlinear conjugate gradient method for accelerating charged polymer dynamics simulation
- Global convergence properties of the BBB conjugate gradient method
- A three-term projection method based on spectral secant equation for nonlinear monotone equations
- Nonstationary fuzzy neural network based on fcmnet clustering and a modified CG method with Armijo-type rule
- A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
- Comments on a new class of nonlinear conjugate gradient coefficients with global convergence properties
- Some three-term conjugate gradient methods with the inexact line search condition
- A derivative-free RMIL conjugate gradient projection method for convex constrained nonlinear monotone equations with applications in compressive sensing
- The convergence properties of RMIL+ conjugate gradient method under the strong Wolfe line search
- A hybrid conjugate finite-step length method for robust and efficient reliability analysis
- Modified three-term conjugate gradient method and its applications
- A smoothing iterative method for the finite minimax problem
- A class of new derivative-free gradient type methods for large-scale nonlinear systems of monotone equations
- A new steepest descent method with global convergence properties
- A new classical conjugate gradient coefficient with exact line search
- An efficient conjugate gradient-based algorithm for unconstrained optimization and its projection extension to large-scale constrained nonlinear equations with applications in signal recovery and image denoising problems
- A new derivative-free conjugate gradient method for large-scale nonlinear systems of equations
- A new class of nonlinear conjugate gradient coefficients for unconstrained optimization
Cites Work
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- The convergence of conjugate gradient method with nonmonotone line search
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
- A conjugate gradient method with descent direction for unconstrained optimization
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- Efficient generalized conjugate gradient algorithms. I: Theory
- Conjugate gradient algorithms in nonconvex optimization
- Efficient hybrid conjugate gradient techniques
- Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
- New line search methods for unconstrained optimization
- A new family of conjugate gradient methods
- Minimization of functions having Lipschitz continuous first partial derivatives
- Convergence Properties of Algorithms for Nonlinear Optimization
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Convergence Conditions for Ascent Methods
- On Steepest Descent
- Methods of conjugate gradients for solving linear systems
- A spectral conjugate gradient method for unconstrained optimization
- Benchmarking optimization software with performance profiles
- Global convergence of conjugate gradient methods without line search