Two new conjugate gradient methods based on modified secant equations
DOI: 10.1016/J.CAM.2010.01.052 · zbMATH Open: 1202.65071 · OpenAlex: W2141102526 · Wikidata: Q57952896 · Scholia: Q57952896 · MaRDI QID: Q972741 · FDO: Q972741
Nezam Mahdavi-Amiri, Saman Babaie-Kafaki, Reza Ghanbari
Publication date: 21 May 2010
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2010.01.052
Keywords: conjugacy condition; global convergence; unconstrained optimization; conjugate gradient method; modified secant equation
Cites Work
- Testing Unconstrained Optimization Software
- Numerical Optimization
- Function minimization by conjugate gradients
- Optimization theory and methods. Nonlinear programming
- Technical Note—A Modified Conjugate Gradient Algorithm
- Restart procedures for the conjugate gradient method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Efficient generalized conjugate gradient algorithms. I: Theory
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- New quasi-Newton methods for unconstrained optimization problems
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- A modified BFGS method and its global convergence in nonconvex minimization
- A spectral conjugate gradient method for unconstrained optimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Convergence properties of the Fletcher-Reeves method
- An efficient hybrid conjugate gradient method for unconstrained optimization
- A globally convergent version of the Polak-Ribière conjugate gradient method
- A Modified BFGS Algorithm for Unconstrained Optimization
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- Conjugate Gradient Methods with Inexact Searches
- Modified two-point stepsize gradient methods for unconstrained optimization
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Multi-step quasi-Newton methods for optimization
- Using function-values in multi-step quasi-Newton methods
- Unified approach to quadratically convergent algorithms for function minimization
- Updating conjugate directions by the BFGS formula
- Convergence of the Polak-Ribière-Polyak conjugate gradient method
Cited In (48)
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
- A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model
- A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations
- A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization
- A descent family of Dai–Liao conjugate gradient methods
- On a scaled symmetric Dai-Liao-type scheme for constrained system of nonlinear equations with applications
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- A modified scaling parameter for the memoryless BFGS updating formula
- An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- Two modified scaled nonlinear conjugate gradient methods
- A modified Perry conjugate gradient method and its global convergence
- A new smoothing spectral conjugate gradient method for solving tensor complementarity problems
- Modified Hager–Zhang conjugate gradient methods via singular value analysis for solving monotone nonlinear equations with convex constraint
- A modified BFGS algorithm based on a hybrid secant equation
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
- A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations
- A new accelerated conjugate gradient method for large-scale unconstrained optimization
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
- A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter
- A class of one parameter conjugate gradient methods
- Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
- An improved nonlinear conjugate gradient method with an optimal property
- A modified Dai-Kou-type method with applications to signal reconstruction and blurred image restoration
- Two modified three-term conjugate gradient methods with sufficient descent property
- A modified scaled memoryless symmetric rank-one method
- A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations
- A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
- A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
- A limited memory descent Perry conjugate gradient method
- A new hybrid three-term LS-CD conjugate gradient in solving unconstrained optimization problems
- Two-step conjugate gradient method for unconstrained optimization
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation
- Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination
- A modified Hager-Zhang conjugate gradient method with optimal choices for solving monotone nonlinear equations