Two modified scaled nonlinear conjugate gradient methods
DOI: 10.1016/j.cam.2013.11.001
zbMath: 1278.65086
OpenAlex: W2062157166
MaRDI QID: Q390466
Publication date: 8 January 2014
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2013.11.001
Keywords: unconstrained optimization; global convergence; BFGS update; sufficient descent condition; CUTEr; modified secant equation; scaled nonlinear conjugate gradient method
MSC classification: Numerical mathematical programming methods (65K05); Methods of quasi-Newton type (90C53); Numerical methods based on nonlinear programming (49M37)
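The keywords above refer to the scaled nonlinear conjugate gradient framework in which the search direction is scaled using BFGS-type information from a modified secant equation. As a rough orientation only, the following minimal Python sketch implements a generic scaled CG iteration with a standard memoryless-BFGS (Barzilai–Borwein-type) scaling and a Perry-type conjugacy parameter; these are assumed, textbook choices from the scaled-CG literature, not the two modified methods proposed in the paper, and the simple Armijo backtracking line search replaces the Wolfe conditions used in the paper's convergence analysis.

```python
import numpy as np

def scaled_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic scaled nonlinear CG sketch (illustrative, not the paper's methods)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        # Backtracking Armijo line search (the paper's theory assumes Wolfe conditions).
        alpha, c1 = 1.0, 1e-4
        fx, gTd = f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * gTd and alpha > 1e-16:
            alpha *= 0.5
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:
            theta = sy / (y @ y)                     # memoryless BFGS / BB-type scaling
            beta = ((theta * y - s) @ g_new) / sy    # Perry-type conjugacy parameter
            d = -theta * g_new + beta * s
            if d @ g_new > -1e-10 * (g_new @ g_new):  # safeguard: enforce descent
                d = -g_new
        else:
            d = -g_new                               # restart with steepest descent
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Example: minimize the Rosenbrock function.
    rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
    rosen_grad = lambda z: np.array([
        -2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
        200 * (z[1] - z[0]**2),
    ])
    print(scaled_cg(rosen, rosen_grad, [-1.2, 1.0]))
```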
Related Items
A modified scaling parameter for the memoryless BFGS updating formula ⋮ On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae ⋮ Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length ⋮ A new spectral conjugate gradient method for large-scale unconstrained optimization ⋮ Two modified spectral conjugate gradient methods and their global convergence for unconstrained optimization ⋮ A modified Perry conjugate gradient method and its global convergence
Uses Software
Cites Work
- A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei
- On restart procedures for the conjugate gradient method
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- A modified BFGS algorithm based on a hybrid secant equation
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Two new conjugate gradient methods based on modified secant equations
- Modified two-point stepsize gradient methods for unconstrained optimization
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Scaled conjugate gradient algorithms for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- Optimization theory and methods. Nonlinear programming
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- A Modified BFGS Algorithm for Unconstrained Optimization
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- Algorithm 851
- Two-Point Step Size Gradient Methods
- Technical Note—A Modified Conjugate Gradient Algorithm
- A Relationship between the BFGS and Conjugate Gradient Algorithms and Its Implications for New Algorithms
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- Optimal conditioning of self-scaling variable metric algorithms
- Matrix conditioning and nonlinear optimization
- Extending the relationship between the conjugate gradient and BFGS algorithms
- Conjugate Gradient Methods with Inexact Searches
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- CUTEr and SifDec
- Convergence Conditions for Ascent Methods
- A Family of Variable-Metric Methods Derived by Variational Means
- The Convergence of a Class of Double-rank Minimization Algorithms
- A new approach to variable metric algorithms
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Conditioning of Quasi-Newton Methods for Function Minimization
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications
- A spectral conjugate gradient method for unconstrained optimization
- Benchmarking optimization software with performance profiles.