A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
DOI: 10.1007/s10957-014-0528-4 · zbMath: 1319.49042 · OpenAlex: W2031891766 · MaRDI QID: Q2346397
Publication date: 1 June 2015
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-014-0528-4
Keywords: unconstrained optimization · global convergence · conjugate gradient method · quasi-Newton method · improved Wolfe line search
MSC classification: Convex programming (90C25) · Nonlinear programming (90C30) · Newton-type methods (49M15) · Numerical methods based on nonlinear programming (49M37)
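The keyword list names the standard building blocks of the method: a memoryless BFGS update, self-scaling, and a Wolfe-type line search. As a rough illustration only, the sketch below implements the generic self-scaling memoryless BFGS iteration with Oren-Luenberger scaling tau = sᵀy / yᵀy and SciPy's standard strong Wolfe line search; it is not the modified method or the improved Wolfe line search proposed in the paper, and the names `ss_memoryless_bfgs` and `rosenbrock` are illustrative choices, not anything from the source.

```python
# Minimal sketch of a generic self-scaling memoryless BFGS iteration
# (textbook scheme, NOT the modified method of the paper itself).
import numpy as np
from scipy.optimize import line_search  # standard strong Wolfe line search

def rosenbrock(x):
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def rosenbrock_grad(x):
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1.0 - x[:-1])
    g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
    return g

def ss_memoryless_bfgs(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first iteration: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:       # line search failed: take a small safe step
            alpha = 1e-4
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
            d = -g_new          # curvature too weak: restart with -gradient
        else:
            tau = sy / (y @ y)  # Oren-Luenberger self-scaling parameter
            # Memoryless BFGS: apply the BFGS inverse update to tau*I,
            # then form the search direction d = -H_{k+1} g_{k+1}.
            gy, gs = g_new @ y, g_new @ s
            d = (-tau * g_new + (tau * gy / sy) * s + (tau * gs / sy) * y
                 - (1.0 + tau * (y @ y) / sy) * (gs / sy) * s)
        x, g = x_new, g_new
    return x, f(x), k

x_star, f_star, iters = ss_memoryless_bfgs(rosenbrock, rosenbrock_grad,
                                           np.full(10, -1.2))
print(f"f* = {f_star:.3e} after {iters} iterations")
```

Note that only two dot products with the most recent pair (s, y) are needed per iteration, which is why memoryless quasi-Newton directions are often viewed as conjugate gradient methods with a built-in restart.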
Related Items
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- A modified scaling parameter for the memoryless BFGS updating formula
- An efficient hybrid conjugate gradient method with sufficient descent property for unconstrained optimization
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- Two families of self-adjusting spectral hybrid DL conjugate gradient methods and applications in image denoising
- A family of hybrid conjugate gradient method with restart procedure for unconstrained optimizations and image restorations
- An inertial spectral CG projection method based on the memoryless BFGS update
- Two families of hybrid conjugate gradient methods with restart procedures and their applications
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- A three-term conjugate gradient algorithm with restart procedure to solve image restoration problems
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
- An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions
- A spectral three-term Hestenes-Stiefel conjugate gradient method
- An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation
- A new conjugate gradient method with an efficient memory structure
- Two improved nonlinear conjugate gradient methods with the strong Wolfe line search
- A three-term conjugate gradient method with accelerated subspace quadratic optimization
- Two classes of spectral conjugate gradient methods for unconstrained optimizations
- A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
- Theoretical characteristics and numerical methods for a class of special piecewise quadratic optimization
Uses Software
Cites Work
- New quasi-Newton equation and related methods for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Self-Scaling Variable Metric (SSVM) Algorithms, Part I
- Self-Scaling Variable Metric (SSVM) Algorithms, Part II
- Restart procedures for the conjugate gradient method
- On the Convergence of a New Conjugate Gradient Algorithm
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- CUTEr and SifDec
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles
- An efficient hybrid conjugate gradient method for unconstrained optimization