A new class of nonlinear conjugate gradient coefficients with global convergence properties
From MaRDI portal
Publication: 387529
DOI: 10.1016/J.AMC.2012.05.030
zbMATH Open: 1278.65094
OpenAlex: W2058309945
MaRDI QID: Q387529
Authors: Mohd Rivaie, Mustafa Mamat, Leong Wah June, Ismail Mohd
Publication date: 23 December 2013
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2012.05.030
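The paper introduces the RMIL conjugate gradient coefficient (named after the authors Rivaie, Mustafa, Ismail and Leong), commonly stated as beta_k = g_k^T (g_k - g_{k-1}) / ||d_{k-1}||^2. The sketch below illustrates a generic nonlinear CG iteration with that coefficient; the backtracking Armijo line search, the clipping of beta at zero, and the steepest-descent restart safeguard are illustrative choices of this sketch, not necessarily the exact scheme analysed in the paper.

```python
import numpy as np

def cg_rmil(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear conjugate gradient with the RMIL-style coefficient
    beta_k = g_k^T (g_k - g_{k-1}) / ||d_{k-1}||^2 (clipped at 0)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:                   # safeguard: restart if not a descent direction
            d = -g
        # backtracking Armijo line search (illustrative, not the paper's exact line search)
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # RMIL-style coefficient; max(0, .) keeps the method well behaved
        beta = max(0.0, g_new.dot(g_new - g) / d.dot(d))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# example: minimize f(x, y) = (x - 1)^2 + 10 (y - 2)^2, minimizer (1, 2)
f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] - 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] - 2)])
x_star = cg_rmil(f, grad, np.array([0.0, 0.0]))
```

On this toy quadratic the iteration drives the gradient norm below the tolerance and returns a point close to (1, 2).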
Recommendations
- A new conjugate gradient coefficient for large scale nonlinear unconstrained optimization
- A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
- A new modification of nonlinear conjugate gradient formula
- A new class of nonlinear conjugate gradient coefficients for unconstrained optimization
- A class of nonlinear conjugate gradient methods with a global convergence property
Cites Work
- Benchmarking optimization software with performance profiles
- Title not available
- Function minimization by conjugate gradients
- Title not available
- An unconstrained optimization test functions collection
- Title not available
- Restart procedures for the conjugate gradient method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Convergence Conditions for Ascent Methods
- Methods of conjugate gradients for solving linear systems
- Efficient generalized conjugate gradient algorithms. I: Theory
- Efficient hybrid conjugate gradient techniques
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Title not available
- A conjugate gradient method with descent direction for unconstrained optimization
- Minimization of functions having Lipschitz continuous first partial derivatives
- Title not available
- On Steepest Descent
- The convergence of conjugate gradient method with nonmonotone line search
- A spectral conjugate gradient method for unconstrained optimization
- Convergence Properties of Algorithms for Nonlinear Optimization
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
- Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
- New line search methods for unconstrained optimization
- A new family of conjugate gradient methods
- Title not available
- Global convergence of conjugate gradient methods without line search
- Conjugate gradient algorithms in nonconvex optimization
Cited In (36)
- Title not available
- Generalized RMIL conjugate gradient method under the strong Wolfe line search with application in image processing
- An efficient conjugate gradient-based algorithm for unconstrained optimization and its projection extension to large-scale constrained nonlinear equations with applications in signal recovery and image denoising problems
- A hybrid conjugate finite-step length method for robust and efficient reliability analysis
- The convergence properties of RMIL+ conjugate gradient method under the strong Wolfe line search
- Another modified version of RMIL conjugate gradient method
- An effective inertial-relaxed CGPM for nonlinear monotone equations
- Hybrid random batch idea and nonlinear conjugate gradient method for accelerating charged polymer dynamics simulation
- A class of new derivative-free gradient type methods for large-scale nonlinear systems of monotone equations
- A new steepest descent method with global convergence properties
- Global convergence properties of the BBB conjugate gradient method
- A new class of nonlinear conjugate gradient method for unconstrained optimization models and its application in portfolio selection
- A modified PRP conjugate gradient method for unconstrained optimization and nonlinear equations
- Some three-term conjugate gradient methods with the inexact line search condition
- Modified three-term conjugate gradient method and its applications
- A derivative-free \textit{RMIL} conjugate gradient projection method for convex constrained nonlinear monotone equations with applications in compressive sensing
- A three-term projection method based on spectral secant equation for nonlinear monotone equations
- An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search
- A new nonlinear conjugate gradient method with guaranteed global convergence
- Two efficient nonlinear conjugate gradient methods for Riemannian manifolds
- A new derivative-free conjugate gradient method for large-scale nonlinear systems of equations
- Nonstationary fuzzy neural network based on fcmnet clustering and a modified CG method with Armijo-type rule
- A recalling-enhanced recurrent neural network: conjugate gradient learning algorithm and its convergence analysis
- A new conjugate gradient coefficient for large scale nonlinear unconstrained optimization
- New hybrid conjugate gradient method for nonlinear optimization with application to image restoration problems
- A convex combination of improved Fletcher-Reeves and Rivaie-Mustafa-Ismail-Leong conjugate gradient methods for unconstrained optimization problems and applications
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Derivative-free RMIL conjugate gradient method for convex constrained equations
- A smoothing iterative method for the finite minimax problem
- A new classical conjugate gradient coefficient with exact line search
- A hybrid CG algorithm for nonlinear unconstrained optimization with application in image restoration
- A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
- Comments on a new class of nonlinear conjugate gradient coefficients with global convergence properties
- A new class of nonlinear conjugate gradient coefficients for unconstrained optimization
- A fast inertial self-adaptive projection based algorithm for solving large-scale nonlinear monotone equations
- The proof of sufficient descent condition for a new type of conjugate gradient methods