A modified nonmonotone BFGS algorithm for unconstrained optimization
From MaRDI portal
Publication:2400759
DOI: 10.1186/s13660-017-1453-5
zbMath: 1372.65179
OpenAlex: W2742802560
Wikidata: Q41332719 (Scholia: Q41332719)
MaRDI QID: Q2400759
Xiangrong Li, Wujie Hu, Bopeng Wang
Publication date: 30 August 2017
Published in: Journal of Inequalities and Applications
Full work available at URL: https://doi.org/10.1186/s13660-017-1453-5
Keywords: unconstrained optimization; global convergence; superlinear convergence; nonmonotone line search; numerical result; BFGS update; Broyden-Fletcher-Goldfarb-Shanno algorithm
MSC classification: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26)
Related Items
Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions ⋮ Convergence analysis of an improved BFGS method and its application in the Muskingum model ⋮ New investigation for the Liu-Story scaled conjugate gradient method for nonlinear optimization
Uses Software
Cites Work
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A BFGS trust-region method for nonlinear equations
- Limited memory BFGS method with backtracking for symmetric nonlinear equations
- Convergence analysis of a modified BFGS method on convex minimizations
- A conjugate gradient method with descent direction for unconstrained optimization
- Differential optimization techniques
- The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- BFGS trust-region method for symmetric nonlinear equations
- A truncated Newton method with non-monotone line search for unconstrained optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Global convergence analysis of a new nonmonotone BFGS algorithm on convex objective functions
- A note on minimization problems and multistep methods
- The BFGS method with exact line searches fails for non-convex objective functions
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A class of nonmonotone stabilization methods in unconstrained optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- New quasi-Newton methods for unconstrained optimization problems
- New line search methods for unconstrained optimization
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization
- The Barzilai and Borwein Gradient Method with Nonmonotone Line Search for Nonsmooth Convex Optimization Problems
- A Trust Region Algorithm with Conjugate Gradient Technique for Optimization Problems
- A new trust-region method with line search for solving symmetric nonlinear equations
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Modified Limited Memory BFGS Method with Nonmonotone Line Search for Unconstrained Optimization
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Testing Unconstrained Optimization Software
- Variable Metric Method for Minimization
- Quasi-Newton Methods, Motivation and Theory
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Convergence Properties of the BFGS Algorithm
- An Assessment of Nonmonotone Linesearch Techniques for Unconstrained Optimization
- Global convergence of the BFGS algorithm with nonmonotone line search
- A BFGS algorithm for solving symmetric nonlinear equations
- One-step and multistep procedures for constrained minimization problems
- CUTEr and SifDec
- A Family of Variable-Metric Methods Derived by Variational Means
- The Convergence of a Class of Double-rank Minimization Algorithms
- A new approach to variable metric algorithms
- Conditioning of Quasi-Newton Methods for Function Minimization
- A New Algorithm for Unconstrained Optimization
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles.