Adaptive scaling damped BFGS method without gradient Lipschitz continuity
Publication: 2060902
DOI: 10.1016/j.aml.2021.107634
zbMath: 1482.90175
OpenAlex: W3197176902
MaRDI QID: Q2060902
Yingjie Zhou, Mengxiang Zhang, Gong Lin Yuan
Publication date: 13 December 2021
Published in: Applied Mathematics Letters
Full work available at URL: https://doi.org/10.1016/j.aml.2021.107634
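For context, the damped BFGS technique named in the title builds on Powell's classical damping of the BFGS update. A standard formulation (given here as general background, not taken from the paper itself): with $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$, the damping parameter is

```latex
\theta_k =
\begin{cases}
1, & s_k^\top y_k \ge 0.2\, s_k^\top B_k s_k,\\[4pt]
\dfrac{0.8\, s_k^\top B_k s_k}{s_k^\top B_k s_k - s_k^\top y_k}, & \text{otherwise},
\end{cases}
\qquad
\bar{y}_k = \theta_k y_k + (1-\theta_k) B_k s_k,
```

and the update replaces $y_k$ by $\bar{y}_k$ in the BFGS formula,

```latex
B_{k+1} = B_k
- \frac{B_k s_k s_k^\top B_k}{s_k^\top B_k s_k}
+ \frac{\bar{y}_k \bar{y}_k^\top}{s_k^\top \bar{y}_k},
```

which guarantees $s_k^\top \bar{y}_k > 0$ and hence positive definiteness of $B_{k+1}$ even for nonconvex objectives, without assuming Lipschitz continuity of the gradient.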
Related Items (6)
- Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
- Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization
- Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization
- Solving nonlinear equations with a direct Broyden method and its acceleration
- Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization
- An adaptive projection BFGS method for nonconvex unconstrained optimization problems
Cites Work
- A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations
- An adaptive scaled BFGS method for unconstrained optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- The global convergence of a modified BFGS method for nonconvex functions
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Testing Unconstrained Optimization Software
- Quasi-Newton Methods, Motivation and Theory
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles.