A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
From MaRDI portal
Publication:2441364
DOI: 10.1007/s10288-013-0233-4
zbMath: 1292.65061
OpenAlex: W2078029965
MaRDI QID: Q2441364
Publication date: 24 March 2014
Published in: 4OR
Full work available at URL: https://doi.org/10.1007/s10288-013-0233-4
Keywords: unconstrained optimization; global convergence; sufficient descent condition; scaled conjugate gradient method; modified secant equation
Numerical mathematical programming methods (65K05); Methods of quasi-Newton type (90C53); Numerical methods based on nonlinear programming (49M37)
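For context on the class of methods this record covers: in a scaled memoryless BFGS preconditioned conjugate gradient method, the search direction is obtained by applying a memoryless (rank-two, built only from the last step) BFGS approximation of the inverse Hessian to the negative gradient, with a self-scaling parameter in the spirit of Oren and Spedicato. The sketch below shows the generic textbook direction formula, not the modified variant proposed in this paper; the function name `ssml_bfgs_direction` and the curvature safeguard are illustrative assumptions.

```python
import numpy as np

def ssml_bfgs_direction(g, s, y, theta):
    """Compute d = -H g, where H is the self-scaling memoryless BFGS
    matrix built from the last step s = x_{k+1} - x_k, the gradient
    difference y = g_{k+1} - g_k, and a scaling parameter theta.

    Generic textbook formula (Perry/Shanno style), shown here only to
    illustrate the method class; it is NOT the paper's modified scheme.
    """
    ys = y @ s
    if ys <= 1e-12:
        # Curvature condition fails: H would not be positive definite,
        # so fall back to the steepest-descent direction.
        return -g
    sg = s @ g
    yg = y @ g
    # Expansion of -H g with
    # H = theta*(I - (s y^T + y s^T)/ys) + (1 + theta*(y^T y)/ys) * s s^T / ys
    return (-theta * g
            + theta * (yg / ys) * s
            + theta * (sg / ys) * y
            - (1.0 + theta * (y @ y) / ys) * (sg / ys) * s)

# Example: with y^T s > 0 and theta > 0, H is positive definite,
# so d is always a descent direction (g^T d < 0).
g = np.array([1.0, 2.0])
s = np.array([0.5, -0.3])
y = np.array([0.4, -0.1])
theta = (s @ y) / (y @ y)   # a common Oren-Spedicato spectral scaling choice
d = ssml_bfgs_direction(g, s, y, theta)
```

In a full algorithm this direction is combined with a line search (e.g. Wolfe conditions), and the various methods listed on this page differ mainly in how `theta` and the secant pair `(s, y)` are modified.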
Related Items (17)
- A hybrid scaling parameter for the scaled memoryless BFGS method based on the ℓ∞ matrix norm
- A modified scaling parameter for the memoryless BFGS updating formula
- A scaled three-term conjugate gradient method for unconstrained optimization
- A hybrid quasi-Newton method with application in sparse recovery
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems
- A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems
- A spectral conjugate gradient method for solving large-scale unconstrained optimization
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization
- Unnamed Item
- Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
- An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems
- Two classes of spectral conjugate gradient methods for unconstrained optimizations
- A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization
Uses Software
Cites Work
- Unnamed Item
- A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei
- On restart procedures for the conjugate gradient method
- A modified BFGS algorithm based on a hybrid secant equation
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Two new conjugate gradient methods based on modified secant equations
- New quasi-Newton equation and related methods for unconstrained optimization
- A structured secant method based on a new quasi-Newton equation for nonlinear least squares problems
- Modified two-point stepsize gradient methods for unconstrained optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- Scaled conjugate gradient algorithms for unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- Algorithm 851
- Two-Point Step Size Gradient Methods
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- Optimal conditioning of self-scaling variable metric algorithms
- Conjugate Gradient Methods with Inexact Searches
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- CUTEr and SifDec
- Convergence Conditions for Ascent Methods
- The conjugate gradient method in extremal problems
- Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications
- A spectral conjugate gradient method for unconstrained optimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles
- A survey of quasi-Newton equations and quasi-Newton methods for optimization
This page was built for publication: A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization