The superlinear convergence of a modified BFGS-type method for unconstrained optimization

From MaRDI portal

Publication: 1771221

DOI: 10.1023/B:COAP.0000044184.25410.39
zbMath: 1070.90089
OpenAlex: W2037114197
MaRDI QID: Q1771221

Zhigang Lian, Gaohang Yu, Gong Lin Yuan, Zeng-xin Wei

Publication date: 7 April 2005

Published in: Computational Optimization and Applications

Full work available at URL: https://doi.org/10.1023/b:coap.0000044184.25410.39
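For background on the method family this record concerns: the sketch below is the *standard* BFGS iteration with a backtracking (Armijo) line search, not the paper's modified BFGS-type update, whose exact formula is not given in this record. All names (`bfgs`, `f`, `grad`) are illustrative.

```python
# Minimal sketch of a standard BFGS iteration (background only; NOT the
# modified BFGS-type method of the paper, which this record does not detail).
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    H = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        t, c = 1.0, 1e-4               # backtracking (Armijo) line search
        while f(x + t * p) > f(x) + c * t * (g @ p):
            t *= 0.5
        x_new = x + t * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                 # curvature guard keeps H positive definite
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS inverse update
        x, g = x_new, g_new
    return x

# Example: Rosenbrock function; the minimizer is (1, 1).
f = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
grad = lambda z: np.array([-2*(1 - z[0]) - 400*z[0]*(z[1] - z[0]**2),
                           200*(z[1] - z[0]**2)])
x_star = bfgs(f, grad, np.array([-1.2, 1.0]))
```

The curvature guard (skipping the update when s·y is too small) is one standard safeguard; the modified BFGS-type methods studied in this literature instead alter the secant condition itself to retain superlinear convergence under weaker assumptions.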

Related Items

A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem
A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
Scaling damped limited-memory updates for unconstrained optimization
A modified Dai-Liao conjugate gradient method for solving unconstrained optimization and image restoration problems
New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
Rates of superlinear convergence for classical quasi-Newton methods
Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
A hybrid quasi-Newton method with application in sparse recovery
Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization
A modified nonmonotone BFGS algorithm for unconstrained optimization
The global convergence of a modified BFGS method for nonconvex functions
Global convergence of a modified Broyden family method for nonconvex functions
Global convergence properties of two modified BFGS-type methods
A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
A method combining norm-relaxed QCQP subproblems with active set identification for inequality constrained optimization
New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
A double parameter scaled BFGS method for unconstrained optimization
Global convergence of a modified limited memory BFGS method for non-convex minimization
Two modified scaled nonlinear conjugate gradient methods
Towards explicit superlinear convergence rate for SR1
Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
The projection technique for two open problems of unconstrained optimization problems
The convergence of a new modified BFGS method without line searches for unconstrained optimization or complexity systems
A modified stochastic quasi-Newton algorithm for summing functions problem in machine learning
A hybrid conjugate gradient algorithm for nonconvex functions and its applications in image restoration problems
Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
An adaptive projection BFGS method for nonconvex unconstrained optimization problems
A Riemannian BFGS Method Without Differentiated Retraction for Nonconvex Optimization Problems
An active set limited memory BFGS algorithm for bound constrained optimization
Globally convergent modified Perry's conjugate gradient method
A modified secant equation quasi-Newton method for unconstrained optimization
A combined class of self-scaling and modified quasi-Newton methods
A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions
Global convergence of a modified BFGS-type method for unconstrained non-convex minimization
The global and superlinear convergence of a new nonmonotone MBFGS algorithm on convex objective functions
On the Local and Superlinear Convergence of a Parameterized DFP Method
A regularized limited memory BFGS method for nonconvex unconstrained minimization
A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
The superlinear convergence of a new quasi-Newton-SQP method for constrained optimization
A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
Convergence analysis of an improved BFGS method and its application in the Muskingum model
Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
Convergence analysis of a modified BFGS method on convex minimizations
Notes on the Dai-Yuan-Yuan modified spectral gradient method
New results on superlinear convergence of classical quasi-Newton methods
New line search methods for unconstrained optimization
Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
A limited memory BFGS-type method for large-scale unconstrained optimization
A new trust region method with adaptive radius
A modified BFGS algorithm based on a hybrid secant equation
A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration
A new backtracking inexact BFGS method for symmetric nonlinear equations
Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
A new type of quasi-Newton updating formulas based on the new quasi-Newton equation
A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function
Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
Greedy Quasi-Newton Methods with Explicit Superlinear Convergence
A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization