A limited memory BFGS-type method for large-scale unconstrained optimization
From MaRDI portal
Publication: 1004767
DOI: 10.1016/j.camwa.2008.01.028
zbMath: 1155.90441
OpenAlex: W1968874483
MaRDI QID: Q1004767
Yun-hai Xiao, Zeng-xin Wei, Zhi Guo Wang
Publication date: 12 March 2009
Published in: Computers & Mathematics with Applications
Full work available at URL: https://doi.org/10.1016/j.camwa.2008.01.028
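For orientation, the paper belongs to the family of limited memory BFGS methods, whose core device is the two-loop recursion of Nocedal's "Updating Quasi-Newton Matrices with Limited Storage" (cited below). The sketch here is that classical recursion, not the modified method of this paper; the function name and the Barzilai-Borwein-type initial scaling are standard conventions, assumed for illustration:

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Classical L-BFGS two-loop recursion (Nocedal, 1980).

    Returns the search direction -H_k * grad, where H_k is the
    inverse-Hessian approximation implicitly defined by the m most
    recent curvature pairs s_i = x_{i+1} - x_i, y_i = g_{i+1} - g_i.
    This is the standard recursion, not this paper's modified update.
    """
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: walk the pairs from newest to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Initial matrix H_0 = gamma * I, with the usual scaling choice.
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: walk back from oldest to newest.
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return -r
```

With no stored pairs the recursion reduces to steepest descent, and when the stored pairs are consistent with an identity Hessian (y_i = s_i) it reproduces the direction -grad exactly.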
Related Items (17)
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- A new adaptive trust region algorithm for optimization problems
- A hybrid iterated local search algorithm with adaptive perturbation mechanism by success-history based parameter adaptation for differential evolution (SHADE)
- Global convergence of a modified limited memory BFGS method for non-convex minimization
- A simple sufficient descent method for unconstrained optimization
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
- A new method with sufficient descent property for unconstrained optimization
- A family of quasi-Newton methods for unconstrained optimization problems
- A regularized limited memory BFGS method for nonconvex unconstrained minimization
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- Notes on the Dai-Yuan-Yuan modified spectral gradient method
- A limited memory \(q\)-BFGS algorithm for unconstrained optimization problems
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
Uses Software
Cites Work
- On the limited memory BFGS method for large scale optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- The BFGS method with exact line searches fails for non-convex objective functions
- A survey of truncated-Newton methods
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- New quasi-Newton methods for unconstrained optimization problems
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Truncated-Newton algorithms for large-scale unconstrained optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Numerical methods for large-scale nonlinear optimization
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Updating Quasi-Newton Matrices with Limited Storage
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- CUTE
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- Convergence Properties of the BFGS Algorithm
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles.