New quasi-Newton methods for unconstrained optimization problems
From MaRDI portal
Publication:2493696
DOI: 10.1016/j.amc.2005.08.027
zbMath: 1100.65054
Wikidata: Q59241598  Scholia: Q59241598
MaRDI QID: Q2493696
Liqun Qi, Guoyin Li, Zeng-xin Wei
Publication date: 16 June 2006
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2005.08.027
Keywords: unconstrained optimization; global convergence; numerical examples; superlinear convergence; quasi-Newton method; quasi-Newton equation; Broyden-Fletcher-Goldfarb-Shanno type algorithms
65K05: Numerical mathematical programming methods
90C30: Nonlinear programming
90C53: Methods of quasi-Newton type
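The keywords and classification codes above point to BFGS-type quasi-Newton methods. As background only, here is a minimal sketch of the *classical* BFGS update with an Armijo backtracking line search — not the modified quasi-Newton equations proposed in the indexed paper itself; the function names and test problem are illustrative assumptions.

```python
# Hedged sketch: the standard BFGS quasi-Newton method, for orientation
# only; the paper indexed here proposes modified quasi-Newton equations,
# which are NOT reproduced below.
import numpy as np

def bfgs_minimize(f, grad, x0, tol=1e-8, max_iter=100):
    """Minimize f from x0 using the classical BFGS update of the
    inverse-Hessian approximation H and an Armijo backtracking search."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                          # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                         # quasi-Newton search direction
        t = 1.0                            # Armijo backtracking line search
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p                          # step s_k = x_{k+1} - x_k
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                      # gradient change y_k
        sy = s @ y
        if sy > 1e-12:                     # curvature condition keeps H positive definite
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS inverse update
        x, g = x_new, g_new
    return x

# Usage on an illustrative convex quadratic whose minimizer is (1, -2).
f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
x_star = bfgs_minimize(f, grad, np.array([0.0, 0.0]))
```

The curvature safeguard `sy > 1e-12` simply skips the update when the secant condition would destroy positive definiteness; the paper's contribution lies in replacing the standard secant equation itself.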
Related Items
- New BFGS method for unconstrained optimization problem based on modified Armijo line search
- A descent family of Dai–Liao conjugate gradient methods
- Modified nonlinear conjugate gradient method with sufficient descent condition for unconstrained optimization
- Scaling on the spectral gradient method
- Global convergence of a modified limited memory BFGS method for non-convex minimization
- Two modified scaled nonlinear conjugate gradient methods
- An improved nonlinear conjugate gradient method with an optimal property
- New quasi-Newton methods via higher order tensor models
- Improved Hessian approximation with modified secant equations for symmetric rank-one method
- An active set limited memory BFGS algorithm for bound constrained optimization
- Convergence analysis of a modified BFGS method on convex minimizations
- A modified BFGS algorithm based on a hybrid secant equation
- Global convergence properties of two modified BFGS-type methods
- The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions
- Two new conjugate gradient methods based on modified secant equations
- Notes on the Dai-Yuan-Yuan modified spectral gradient method
- A limited memory BFGS-type method for large-scale unconstrained optimization
- A new backtracking inexact BFGS method for symmetric nonlinear equations
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- New line search methods for unconstrained optimization
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization
- A CLASS OF MODIFIED BFGS METHODS WITH FUNCTION VALUE INFORMATION FOR UNCONSTRAINED OPTIMIZATION
Uses Software
Cites Work
- An SQP-type method and its application in stochastic programs
- Local convergence analysis for partitioned quasi-Newton updates
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Testing Unconstrained Optimization Software
- Variable Metric Method for Minimization
- On the Convergence of a New Conjugate Gradient Algorithm
- A Globally and Superlinearly Convergent Gauss--Newton-Based BFGS Method for Symmetric Nonlinear Equations
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A New Algorithm for Unconstrained Optimization
- A modified BFGS method and its global convergence in nonconvex minimization