Two new conjugate gradient methods based on modified secant equations
Publication: 972741
DOI: 10.1016/j.cam.2010.01.052
zbMath: 1202.65071
OpenAlex: W2141102526
Wikidata: Q57952896
Scholia: Q57952896
MaRDI QID: Q972741
Authors: Nezam Mahdavi-Amiri, Saman Babaie-Kafaki, Reza Ghanbari
Publication date: 21 May 2010
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2010.01.052
Keywords: unconstrained optimization; global convergence; conjugate gradient method; conjugacy condition; modified secant equation
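The keywords summarize the paper's setting: conjugate gradient (CG) directions whose parameter is chosen so that a conjugacy condition holds with respect to a modified secant equation. As an illustrative sketch only (not the paper's exact methods), the following implements a Dai–Liao-type CG direction where the usual gradient-difference vector `y` is replaced by a modified secant vector `z = y + (theta / s·s) s`, with `theta = 6(f_k - f_{k+1}) + 3(g_k + g_{k+1})·s` (the modified secant term of Zhang-Deng-Chen-type equations); the line search, safeguards, and parameter `t` here are generic assumptions, not taken from the paper.

```python
import numpy as np

def dl_cg_modified_secant(f, grad, x0, t=0.1, tol=1e-8, max_iter=5000):
    """Dai-Liao-type CG with a modified secant vector (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial steepest-descent direction
    fx = f(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple backtracking Armijo line search (a generic choice).
        alpha = 1.0
        while f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        f_new = f(x_new)
        s = x_new - x
        y = g_new - g
        # Modified secant vector: incorporates function values, not just gradients.
        theta = 6.0 * (fx - f_new) + 3.0 * (g + g_new) @ s
        z = y + (theta / max(s @ s, 1e-30)) * s
        denom = d @ z
        if abs(denom) < 1e-12:
            beta = 0.0                       # safeguard: restart with -g
        else:
            # Dai-Liao parameter with y replaced by the modified vector z,
            # enforcing the conjugacy condition d_{k+1}^T z = -t g_{k+1}^T s.
            beta = (g_new @ z - t * (g_new @ s)) / denom
        d = -g_new + beta * d
        if g_new @ d > -1e-12 * (g_new @ g_new):
            d = -g_new                       # restart if descent is lost
        x, g, fx = x_new, g_new, f_new
    return x, fx
```

For example, minimizing the separable quadratic `sum((x - 1)^2)` from the origin recovers the minimizer at the all-ones vector in a few iterations.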
Related Items (47)
- Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination
- A modified scaling parameter for the memoryless BFGS updating formula
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family
- A modified Hager-Zhang conjugate gradient method with optimal choices for solving monotone nonlinear equations
- A modified Dai-Kou-type method with applications to signal reconstruction and blurred image restoration
- A class of one parameter conjugate gradient methods
- A new smoothing spectral conjugate gradient method for solving tensor complementarity problems
- Two modified scaled nonlinear conjugate gradient methods
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems
- A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations
- A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
- A modified scaled memoryless symmetric rank-one method
- A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model
- On a scaled symmetric Dai-Liao-type scheme for constrained system of nonlinear equations with applications
- Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation
- Two-step conjugate gradient method for unconstrained optimization
- Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
- Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- An improved nonlinear conjugate gradient method with an optimal property
- Two modified three-term conjugate gradient methods with sufficient descent property
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
- A limited memory descent Perry conjugate gradient method
- A descent family of Dai–Liao conjugate gradient methods
- A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization
- Modified Hager–Zhang conjugate gradient methods via singular value analysis for solving monotone nonlinear equations with convex constraint
- Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations
- A modified BFGS algorithm based on a hybrid secant equation
- A new accelerated conjugate gradient method for large-scale unconstrained optimization
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter
- Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations
- A modified Perry conjugate gradient method and its global convergence
Uses Software
Cites Work
- Convergence of the Polak-Ribière-Polyak conjugate gradient method
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Multi-step quasi-Newton methods for optimization
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Modified two-point stepsize gradient methods for unconstrained optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- Using function-values in multi-step quasi-Newton methods
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- New quasi-Newton methods for unconstrained optimization problems
- Optimization theory and methods. Nonlinear programming
- Unified approach to quadratically convergent algorithms for function minimization
- A Modified BFGS Algorithm for Unconstrained Optimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Updating conjugate directions by the BFGS formula
- Technical Note—A Modified Conjugate Gradient Algorithm
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- Conjugate Gradient Methods with Inexact Searches
- Numerical Optimization
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Convergence properties of the Fletcher-Reeves method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- A spectral conjugate gradient method for unconstrained optimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- An efficient hybrid conjugate gradient method for unconstrained optimization