Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
From MaRDI portal
Publication:2190800
DOI: 10.1007/s10288-019-00412-2
zbMath: 1445.90085
OpenAlex: W2960088919
Wikidata: Q127469460
Scholia: Q127469460
MaRDI QID: Q2190800
Publication date: 22 June 2020
Published in: 4OR
Full work available at URL: https://doi.org/10.1007/s10288-019-00412-2
Related Items (3)
- Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization
- Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
- A globally convergent BFGS method for symmetric nonlinear equations
Cites Work
- Scaling damped limited-memory updates for unconstrained optimization
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Global convergence of a modified limited memory BFGS method for non-convex minimization
- New cautious BFGS algorithm based on modified Armijo-type line search
- A combined class of self-scaling and modified quasi-Newton methods
- A nonmonotone filter line search technique for the MBFGS method in unconstrained optimization
- A class of diagonal quasi-Newton methods for large-scale convex minimization
- New quasi-Newton methods via higher order tensor models
- Convergence analysis of a modified BFGS method on convex minimizations
- Global convergence properties of two modified BFGS-type methods
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
- A new structured quasi-Newton algorithm using partial information on Hessian
- A limited memory BFGS-type method for large-scale unconstrained optimization
- Globally convergent Polak-Ribière-Polyak conjugate gradient methods under a modified Wolfe line search
- New quasi-Newton equation and related methods for unconstrained optimization
- The BFGS method with exact line searches fails for non-convex objective functions
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Damped techniques for the limited memory BFGS method for large-scale optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- The global convergence of a modified BFGS method for nonconvex functions
- New quasi-Newton methods for unconstrained optimization problems
- A class of modified BFGS methods with function value information for unconstrained optimization
- A Modified Non-Monotone BFGS Method for Non-Convex Unconstrained Optimization
- A Modified BFGS Algorithm for Unconstrained Optimization
- Modified limited memory BFGS method with nonmonotone line search for unconstrained optimization
- Convergence Properties of the BFGS Algorithm
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- New BFGS method for unconstrained optimization problem based on modified Armijo line search
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles