The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions
From MaRDI portal
Publication:928232
Recommendations
- Global convergence analysis of a new nonmonotone BFGS algorithm on convex objective functions
- The global and superlinear convergence of a new nonmonotone MBFGS algorithm on convex objective functions
- Convergence analysis of a modified BFGS method on convex minimizations
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- A modified BFGS method and its global convergence in nonconvex minimization
Cites work
- Scientific article; zbMATH DE number 3529352 (no title available)
- Scientific article; zbMATH DE number 1131990 (no title available)
- Scientific article; zbMATH DE number 3453051 (no title available)
- A New Algorithm for Unconstrained Optimization
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- A modified BFGS method and its global convergence in nonconvex minimization
- An SQP-type method and its application in stochastic programs
- Extended global convergence framework for unconstrained optimization
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- Global convergence of the BFGS algorithm with nonmonotone line search
- Global convergence analysis of a new nonmonotone BFGS algorithm on convex objective functions
- Global convergence of the Broyden's class of quasi-Newton methods with nonmonotone linesearch
- Local convergence analysis for partitioned quasi-Newton updates
- New quasi-Newton methods for unconstrained optimization problems
- On the Convergence of the Variable Metric Algorithm
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- The global convergence of self-scaling BFGS algorithm with non-monotone line search for unconstrained nonconvex optimization problems
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Über die globale Konvergenz von Variable-Metrik-Verfahren mit nicht-exakter Schrittweitenbestimmung [On the global convergence of variable metric methods with inexact step-size selection]
Cited in (15 documents)
- A conjugate gradient method with descent direction for unconstrained optimization
- A modified nonmonotone BFGS algorithm for solving smooth nonlinear equations
- A trust region algorithm with conjugate gradient technique for optimization problems
- The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- The global and superlinear convergence of a new nonmonotone MBFGS algorithm on convex objective functions
- A new adaptive trust region algorithm for optimization problems
- A New Method with Descent Property for Symmetric Nonlinear Equations
- A BFGS algorithm for solving symmetric nonlinear equations
- Global convergence of a cautious projection BFGS algorithm for nonconvex problems without gradient Lipschitz continuity
- An active set limited memory BFGS algorithm for bound constrained optimization
- Global and local convergence of a new affine scaling trust region algorithm for linearly constrained optimization
- A quasi-Newton algorithm for large-scale nonlinear equations
- BFGS trust-region method for symmetric nonlinear equations
- Convergence analysis of a modified BFGS method on convex minimizations