An adaptive projection BFGS method for nonconvex unconstrained optimization problems
From MaRDI portal
Publication:6202792
Recommendations
- The global convergence of a modified BFGS method for nonconvex functions
- Global convergence of a cautious projection BFGS algorithm for nonconvex problems without gradient Lipschitz continuity
- A modified non-monotone BFGS method for non-convex unconstrained optimization
- The projection technique for two open problems of unconstrained optimization problems
- A nonmonotone modified BFGS algorithm for nonconvex unconstrained optimization problems
Cites work
- Scientific article (untitled; zbMATH DE number 3529352)
- Scientific article (untitled; zbMATH DE number 778130)
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A Family of Variable-Metric Methods Derived by Variational Means
- A New Algorithm for Unconstrained Optimization
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- A class of parameter estimation methods for nonlinear Muskingum model using hybrid invasive weed optimization algorithm
- A globally convergent version of the Polak-Ribière conjugate gradient method
- A modified BFGS method and its global convergence in nonconvex minimization
- A new approach to variable metric algorithms
- A perfect example for the BFGS method
- A projection method for a system of nonlinear monotone equations with convex constraints
- A projection method for convex constrained monotone nonlinear equations with applications
- Adaptive scaling damped BFGS method without gradient Lipschitz continuity
- An adaptive scaled BFGS method for unconstrained optimization
- An unconstrained optimization test functions collection
- Benchmarking optimization software with performance profiles
- Conditioning of Quasi-Newton Methods for Function Minimization
- Convergence analysis of a modified BFGS method on convex minimizations
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- Global convergence of the partitioned BFGS algorithm for convex partially separable optimization
- Local convergence analysis for partitioned quasi-Newton updates
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- On the global convergence of the BFGS method for nonconvex unconstrained optimization problems
- Quasi-Newton Methods, Motivation and Theory
- Spectral gradient projection method for solving nonlinear monotone equations
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- The global convergence of a modified BFGS method for nonconvex functions
- The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- The projection technique for two open problems of unconstrained optimization problems
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization