The superlinear convergence of a modified BFGS-type method for unconstrained optimization
Publication: 1771221
DOI: 10.1023/B:COAP.0000044184.25410.39
zbMath: 1070.90089
OpenAlex: W2037114197
MaRDI QID: Q1771221
Authors: Zhigang Lian, Gaohang Yu, Gong Lin Yuan, Zeng-xin Wei
Publication date: 7 April 2005
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1023/b:coap.0000044184.25410.39
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Numerical computation of solutions to systems of equations (65H10)
Related Items
A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem ⋮ A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems ⋮ Scaling damped limited-memory updates for unconstrained optimization ⋮ A modified Dai-Liao conjugate gradient method for solving unconstrained optimization and image restoration problems ⋮ New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters ⋮ Rates of superlinear convergence for classical quasi-Newton methods ⋮ Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model ⋮ A hybrid quasi-Newton method with application in sparse recovery ⋮ Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization ⋮ A modified nonmonotone BFGS algorithm for unconstrained optimization ⋮ The global convergence of a modified BFGS method for nonconvex functions ⋮ Global convergence of a modified Broyden family method for nonconvex functions ⋮ Global convergence properties of two modified BFGS-type methods ⋮ A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization ⋮ A method combining norm-relaxed QCQP subproblems with active set identification for inequality constrained optimization ⋮ New conjugacy condition and related new conjugate gradient methods for unconstrained optimization ⋮ A double parameter scaled BFGS method for unconstrained optimization ⋮ Global convergence of a modified limited memory BFGS method for non-convex minimization ⋮ Two modified scaled nonlinear conjugate gradient methods ⋮ Towards explicit superlinear convergence rate for SR1 ⋮ Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions ⋮ The projection technique for two open problems of unconstrained optimization problems ⋮ The convergence of a new modified BFGS method without line searches for unconstrained optimization or complexity systems ⋮ A modified stochastic quasi-Newton algorithm for summing functions problem in machine learning ⋮ A hybrid conjugate gradient algorithm for nonconvex functions and its applications in image restoration problems ⋮ Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization ⋮ An adaptive projection BFGS method for nonconvex unconstrained optimization problems ⋮ A Riemannian BFGS Method Without Differentiated Retraction for Nonconvex Optimization Problems ⋮ An active set limited memory BFGS algorithm for bound constrained optimization ⋮ Globally convergent modified Perry's conjugate gradient method ⋮ A modified secant equation quasi-Newton method for unconstrained optimization ⋮ A combined class of self-scaling and modified quasi-Newton methods ⋮ A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function ⋮ The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions ⋮ Global convergence of a modified BFGS-type method for unconstrained non-convex minimization ⋮ The global and superlinear convergence of a new nonmonotone MBFGS algorithm on convex objective functions ⋮ On the Local and Superlinear Convergence of a Parameterized DFP Method ⋮ A regularized limited memory BFGS method for nonconvex unconstrained minimization ⋮ A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs ⋮ The superlinear convergence of a new quasi-Newton-SQP method for constrained optimization ⋮ A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems ⋮ Convergence analysis of an improved BFGS method and its application in the Muskingum model ⋮ Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization ⋮ Convergence analysis of a modified BFGS method on convex minimizations ⋮ Notes on the Dai-Yuan-Yuan modified spectral gradient method ⋮ New results on superlinear convergence of classical quasi-Newton methods ⋮ New line search methods for unconstrained optimization ⋮ Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search ⋮ A limited memory BFGS-type method for large-scale unconstrained optimization ⋮ A new trust region method with adaptive radius ⋮ A modified BFGS algorithm based on a hybrid secant equation ⋮ A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration ⋮ A new backtracking inexact BFGS method for symmetric nonlinear equations ⋮ Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization ⋮ A new type of quasi-Newton updating formulas based on the new quasi-Newton equation ⋮ A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function ⋮ Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing ⋮ Greedy Quasi-Newton Methods with Explicit Superlinear Convergence ⋮ A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization