An adaptive projection BFGS method for nonconvex unconstrained optimization problems
From MaRDI portal
Publication:6202792
DOI: 10.1007/s11075-023-01626-6
OpenAlex: W4385606994
MaRDI QID: Q6202792
Xiaoxuan Chen, Xiong Zhao, Gong Lin Yuan, Kejun Liu
Publication date: 26 March 2024
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-023-01626-6
Cites Work
- An adaptive scaled BFGS method for unconstrained optimization
- Convergence analysis of a modified BFGS method on convex minimizations
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
- A globally convergent version of the Polak-Ribière conjugate gradient method
- A class of parameter estimation methods for nonlinear Muskingum model using hybrid invasive weed optimization algorithm
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Local convergence analysis for partitioned quasi-Newton updates
- A perfect example for the BFGS method
- A projection method for convex constrained monotone nonlinear equations with applications
- Adaptive scaling damped BFGS method without gradient Lipschitz continuity
- The projection technique for two open problems of unconstrained optimization problems
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- The global convergence of a modified BFGS method for nonconvex functions
- A projection method for a system of nonlinear monotone equations with convex constraints
- Spectral gradient projection method for solving nonlinear monotone equations
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Global convergence of the partitioned BFGS algorithm for convex partially separable optimization
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Quasi-Newton Methods, Motivation and Theory
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization
- A New Algorithm for Unconstrained Optimization
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles