New quasi-Newton methods for unconstrained optimization problems
From MaRDI portal
Publication: 2493696
DOI: 10.1016/j.amc.2005.08.027 · zbMath: 1100.65054 · Wikidata: Q59241598 · Scholia: Q59241598 · MaRDI QID: Q2493696
Liqun Qi, Guoyin Li, Zeng-xin Wei
Publication date: 16 June 2006
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2005.08.027
Keywords: unconstrained optimization; global convergence; numerical examples; superlinear convergence; quasi-Newton method; quasi-Newton equation; Broyden-Fletcher-Goldfarb-Shanno type algorithms
65K05: Numerical mathematical programming methods
90C30: Nonlinear programming
90C53: Methods of quasi-Newton type
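The keywords and MSC codes above center on BFGS-type quasi-Newton updates built around the quasi-Newton (secant) equation \(H_{k+1} y_k = s_k\). As a minimal illustrative sketch, not the modified method proposed in the paper itself, the classical BFGS inverse-Hessian update can be written as follows (the quadratic test problem and all variable names are assumptions for the example):

```python
import numpy as np

def bfgs_update(H, s, y):
    """Classical BFGS update of the inverse Hessian approximation H.

    s = x_{k+1} - x_k (step), y = grad_{k+1} - grad_k (gradient change).
    The updated matrix H_new satisfies the quasi-Newton (secant)
    equation H_new @ y = s.
    """
    rho = 1.0 / (y @ s)  # requires the curvature condition y's > 0
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Illustrative run: minimize f(x) = 0.5 x'Ax - b'x, whose minimizer solves Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b

x = np.zeros(2)
H = np.eye(2)  # start from the identity as the inverse-Hessian guess
for _ in range(20):
    g = grad(x)
    x_new = x - H @ g          # quasi-Newton step (unit step length)
    s, y = x_new - x, grad(x_new) - g
    if y @ s > 1e-12:          # skip the update if curvature is not positive
        H = bfgs_update(H, s, y)
    x = x_new

print("solved Ax = b:", np.allclose(A @ x, b))
```

On a quadratic, each update forces the new secant condition exactly, so \(H\) quickly agrees with \(A^{-1}\) on the explored subspace and the iteration reduces to a Newton step; the modified quasi-Newton equations surveyed in the related items below replace \(y_k\) with corrected vectors that also use function-value information.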
Related Items
- A Riemannian BFGS Method Without Differentiated Retraction for Nonconvex Optimization Problems
- Higher order curvature information and its application in a modified diagonal Secant method
- A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization
- Unnamed Item
- Rapidly convergent Steffensen-based methods for unconstrained optimization
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- New BFGS method for unconstrained optimization problem based on modified Armijo line search
- A descent family of Dai–Liao conjugate gradient methods
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- A modified scaling parameter for the memoryless BFGS updating formula
- Modified nonlinear conjugate gradient method with sufficient descent condition for unconstrained optimization
- Scaling on the spectral gradient method
- Global convergence of a modified limited memory BFGS method for non-convex minimization
- Two modified scaled nonlinear conjugate gradient methods
- An improved nonlinear conjugate gradient method with an optimal property
- New quasi-Newton methods via higher order tensor models
- Improved Hessian approximation with modified secant equations for symmetric rank-one method
- An active set limited memory BFGS algorithm for bound constrained optimization
- Convergence analysis of a modified BFGS method on convex minimizations
- A conjugate gradient method with sufficient descent property
- A modified BFGS algorithm based on a hybrid secant equation
- A new type of quasi-Newton updating formulas based on the new quasi-Newton equation
- Global convergence properties of two modified BFGS-type methods
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation
- The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions
- Two new conjugate gradient methods based on modified secant equations
- Notes on the Dai-Yuan-Yuan modified spectral gradient method
- A limited memory BFGS-type method for large-scale unconstrained optimization
- A new backtracking inexact BFGS method for symmetric nonlinear equations
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- A new adaptive trust region algorithm for optimization problems
- A new adaptive Barzilai and Borwein method for unconstrained optimization
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- Nonmonotone adaptive Barzilai-Borwein gradient algorithm for compressed sensing
- An effective adaptive trust region algorithm for nonsmooth minimization
- A new modified BFGS method for unconstrained optimization problems
- A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
- Convergence analysis of an improved BFGS method and its application in the Muskingum model
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- A modified Newton-like method for nonlinear equations
- Two-step conjugate gradient method for unconstrained optimization
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
- Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
- A survey of gradient methods for solving nonlinear optimization
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Nonmonotone adaptive trust region method with line search based on new diagonal updating
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
- An adaptive sizing BFGS method for unconstrained optimization
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- A modified quasi-Newton method for nonlinear equations
- New line search methods for unconstrained optimization
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization
- A CLASS OF MODIFIED BFGS METHODS WITH FUNCTION VALUE INFORMATION FOR UNCONSTRAINED OPTIMIZATION
- Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
Cites Work
- Unnamed Item
- Unnamed Item
- An SQP-type method and its application in stochastic programs
- Local convergence analysis for partitioned quasi-Newton updates
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Testing Unconstrained Optimization Software
- Variable Metric Method for Minimization
- On the Convergence of a New Conjugate Gradient Algorithm
- A Globally and Superlinearly Convergent Gauss--Newton-Based BFGS Method for Symmetric Nonlinear Equations
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A New Algorithm for Unconstrained Optimization
- A modified BFGS method and its global convergence in nonconvex minimization