A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
From MaRDI portal
Publication: 3514834
DOI: 10.1080/02331930601127909 · zbMath: 1338.90457 · OpenAlex: W2051800432 · MaRDI QID: Q3514834
Publication date: 23 July 2008
Published in: Optimization
Full work available at URL: https://doi.org/10.1080/02331930601127909
Keywords: unconstrained optimization; conjugate gradient method; numerical comparisons; spectral gradient method; BFGS formula
MSC classifications: Nonconvex programming, global optimization (90C26); Numerical optimization and variational techniques (65K10); Methods of reduced gradient type (90C52)
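The keywords above describe a scaled (spectral) nonlinear conjugate gradient method. As a rough illustration of what such a scheme looks like, here is a generic sketch, not the paper's exact algorithm: the search direction combines a spectrally scaled negative gradient with a conjugacy term, d_{k+1} = -θ_{k+1} g_{k+1} + β_k s_k, where the Barzilai-Borwein-style scaling θ = sᵀs / sᵀy stands in for the paper's scaling parameter. The function names, the Armijo backtracking line search, and the restart safeguard are all assumptions made for this sketch.

```python
import numpy as np

def scaled_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic scaled nonlinear CG sketch (illustrative, not the paper's method).

    Direction update: d_{k+1} = -theta_{k+1} * g_{k+1} + beta_k * s_k,
    with spectral scaling theta = s^T s / s^T y (Barzilai-Borwein style).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple Armijo backtracking line search (a stand-in for a Wolfe search)
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        if sy > 1e-12:
            theta = s.dot(s) / sy            # spectral scaling parameter
            beta = theta * g_new.dot(y) / sy # scaled Hestenes-Stiefel-like beta
        else:
            theta, beta = 1.0, 0.0
        d = -theta * g_new + beta * s
        if g_new.dot(d) >= 0:                # safeguard: restart if not a descent direction
            d = -theta * g_new
        x, g = x_new, g_new
    return x

# Usage: minimize a mildly ill-conditioned convex quadratic f(x) = 0.5 x^T A x
A = np.diag([1.0, 10.0])
x_min = scaled_cg(lambda x: 0.5 * x @ A @ x, lambda x: A @ x, np.array([3.0, 4.0]))
```

For a convex quadratic with Armijo steps, sᵀy = α² dᵀAd > 0 holds automatically, so the scaling is always well defined in this toy run; on nonconvex problems the `sy > 1e-12` guard matters.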
Related Items (8)
- A modified scaling parameter for the memoryless BFGS updating formula
- Two modified scaled nonlinear conjugate gradient methods
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- A descent family of the spectral Hestenes–Stiefel method by considering the quasi-Newton method
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- Application of scaled nonlinear conjugate-gradient algorithms to the inverse natural convection problem
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
Uses Software
Cites Work
- Convergence properties of the Beale-Powell restart algorithm
- Modified two-point stepsize gradient methods for unconstrained optimization
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Self-Scaling Variable Metric (SSVM) Algorithms
- Self-Scaling Variable Metric (SSVM) Algorithms
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- Optimal conditioning of self-scaling variable Metric algorithms
- Restart procedures for the conjugate gradient method
- Conjugate Gradient Methods with Inexact Searches
- On the Convergence of a New Conjugate Gradient Algorithm
- CUTE
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A spectral conjugate gradient method for unconstrained optimization
- Benchmarking optimization software with performance profiles