Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
Publication: 3805796
DOI: 10.1137/0724077 · zbMath: 0657.65083 · OpenAlex: W2045968916 · MaRDI QID: Q3805796
Byrd, Richard H.; Nocedal, Jorge; Yuan, Ya-Xiang
Publication date: 1987
Published in: SIAM Journal on Numerical Analysis
Full work available at URL: https://doi.org/10.1137/0724077
Keywords: convergence; numerical example; nonlinear optimization; minimization; BFGS method; quasi-Newton methods; quasi-Newton updates; optimization algorithms; superlinear convergence; line searches; DFP method; Broyden's one-parameter class
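The keywords center on the BFGS update within Broyden's one-parameter class and its behavior on convex problems under line searches. As a minimal illustrative sketch of that setting (not code from the paper itself; the problem, tolerances, and helper name are made up for the example), the following applies BFGS with a simple Armijo backtracking line search to a convex quadratic:

```python
import numpy as np

def bfgs_update(H, s, y):
    """One BFGS update of the inverse Hessian approximation H.

    s = x_{k+1} - x_k (step), y = g_{k+1} - g_k (gradient change).
    The update is skipped when the curvature condition s^T y > 0
    fails, which preserves positive definiteness of H.
    """
    sy = s @ y
    if sy <= 1e-12:
        return H  # curvature condition violated: keep previous H
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b (here x* = [0.2, 0.4]).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x = np.zeros(2)
H = np.eye(2)                       # initial inverse Hessian guess
for _ in range(100):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    p = -H @ g                      # quasi-Newton search direction
    alpha = 1.0                     # Armijo backtracking line search
    while f(x + alpha * p) > f(x) + 1e-4 * alpha * (g @ p):
        alpha *= 0.5
    s = alpha * p
    x_new = x + s
    H = bfgs_update(H, s, grad(x_new) - g)
    x = x_new

print(x)  # close to the minimizer [0.2, 0.4]
```

On a strongly convex quadratic the curvature condition always holds for a nonzero step, so every update is accepted and the iterates converge rapidly; the convergence theory for this scheme on general convex problems is the subject of the paper.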
Related Items (only showing first 100 items)
Global convergence of the Broyden's class of quasi-Newton methods with nonmonotone linesearch ⋮ A new trust region method with adaptive radius for unconstrained optimization ⋮ A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem ⋮ A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems ⋮ Family of optimally conditioned quasi-Newton updates for unconstrained optimization ⋮ Local and superlinear convergence of quasi-Newton methods based on modified secant conditions ⋮ A new quasi-Newton algorithm ⋮ Convergence and numerical results for a parallel asynchronous quasi-Newton method ⋮ On \(q\)-BFGS algorithm for unconstrained optimization problems ⋮ A parallel quasi-Newton algorithm for unconstrained optimization ⋮ Efficient line search algorithm for unconstrained optimization ⋮ Limited-memory BFGS with displacement aggregation ⋮ Rates of superlinear convergence for classical quasi-Newton methods ⋮ An analysis of reduced Hessian methods for constrained optimization ⋮ Global convergence of the non-quasi-Newton method for unconstrained optimization problems ⋮ Modifying the BFGS method ⋮ Damped techniques for enforcing convergence of quasi-Newton methods ⋮ Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems ⋮ The global convergence of the BFGS method with a modified WWP line search for nonconvex functions ⋮ The regularization continuation method with an adaptive time step control for linearly constrained optimization problems ⋮ Some convergence properties of descent methods ⋮ Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization ⋮ A modified nonmonotone BFGS algorithm for unconstrained optimization ⋮ A Modified Non-Monotone BFGS Method for Non-Convex Unconstrained Optimization ⋮ A class of one parameter conjugate gradient methods ⋮ A hybrid algorithm for linearly constrained minimax problems ⋮ The global convergence of a modified BFGS method for nonconvex functions ⋮ Global convergence of a modified Broyden family method for nonconvex functions ⋮ A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization ⋮ Spectral scaling BFGS method ⋮ Nonsmooth optimization via quasi-Newton methods ⋮ A double parameter scaled BFGS method for unconstrained optimization ⋮ Global convergence of a modified limited memory BFGS method for non-convex minimization ⋮ A Broyden Class of Quasi-Newton Methods for Riemannian Optimization ⋮ Comparative analysis of gradient methods for source identification in a diffusion-logistic model ⋮ A new BFGS algorithm using the decomposition matrix of the correction matrix to obtain the search directions ⋮ A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems ⋮ Low complexity matrix projections preserving actions on vectors ⋮ The projection technique for two open problems of unconstrained optimization problems ⋮ The convergence of a new modified BFGS method without line searches for unconstrained optimization or complexity systems ⋮ On the stable global convergence of particular quasi-Newton methods ⋮ New quasi-Newton methods via higher order tensor models ⋮ A perfect example for the BFGS method ⋮ Some numerical experiments with variable-storage quasi-Newton algorithms ⋮ Secant penalized BFGS: a noise robust quasi-Newton method via penalizing the secant condition ⋮ A combined class of self-scaling and modified quasi-Newton methods ⋮ A new class of quasi-Newton updating formulas ⋮ The hybrid BFGS-CG method in solving unconstrained optimization problems ⋮ The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions ⋮ A Bregman extension of quasi-Newton updates. II: Analysis of robustness properties ⋮ Diagonal quasi-Newton methods via least change updating principle with weighted Frobenius norm ⋮ The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients ⋮ Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search ⋮ A partitioned PSB method for partially separable unconstrained optimization problems ⋮ Analysis of sparse quasi-Newton updates with positive definite matrix completion ⋮ The global and superlinear convergence of a new nonmonotone MBFGS algorithm on convex objective functions ⋮ Block BFGS Methods ⋮ A class of diagonal quasi-Newton methods for large-scale convex minimization ⋮ Global convergence property of scaled two-step BFGS method ⋮ A modified BFGS method and its superlinear convergence in nonconvex minimization with general line search rule ⋮ Quasi-Newton methods: superlinear convergence without line searches for self-concordant functions ⋮ Using gradient directions to get global convergence of Newton-type methods ⋮ An adaptive scaled BFGS method for unconstrained optimization ⋮ A new modified BFGS method for unconstrained optimization problems ⋮ A regularized limited memory BFGS method for nonconvex unconstrained minimization ⋮ Analysis of a self-scaling quasi-Newton method ⋮ A nonmonotone PSB algorithm for solving unconstrained optimization ⋮ Sufficient descent directions in unconstrained optimization ⋮ Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization ⋮ Convergence property of a class of variable metric methods ⋮ New quasi-Newton methods for unconstrained optimization problems ⋮ Convergence analysis of a modified BFGS method on convex minimizations ⋮ Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints ⋮ A variation of Broyden class methods using Householder adaptive transforms ⋮ A globally convergent BFGS method for nonlinear monotone equations without any merit functions ⋮ Using nonlinear functions to approximate a new quasi-Newton method for unconstrained optimization problems ⋮ Explicit pseudo-transient continuation and the trust-region updating strategy for unconstrained optimization ⋮ New results on superlinear convergence of classical quasi-Newton methods ⋮ Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search ⋮ Wide interval for efficient self-scaling quasi-Newton algorithms ⋮ A globally convergent BFGS method for nonconvex minimization without line searches ⋮ A limited memory BFGS-type method for large-scale unconstrained optimization ⋮ Numerical experience with a class of self-scaling quasi-Newton algorithms ⋮ The revised DFP algorithm without exact line search ⋮ Linear Convergence of Descent Methods for the Unconstrained Minimization of Restricted Strongly Convex Functions ⋮ A globally convergent BFGS method with nonmonotone line search for non-convex minimization ⋮ A modified BFGS algorithm based on a hybrid secant equation ⋮ A new backtracking inexact BFGS method for symmetric nonlinear equations ⋮ Variable-metric technique for the solution of affinely parametrized nondifferentiable optimal design problems ⋮ Globally convergent BFGS method for nonsmooth convex optimization ⋮ The convergence of Broyden algorithms for LC gradient function ⋮ A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization ⋮ Variable metric methods for unconstrained optimization and nonlinear least squares ⋮ An improved quasi-Newton method for unconstrained optimization ⋮ A derivative-free line search and DFP method for symmetric equations with global and superlinear convergence ⋮ A CLASS OF MODIFIED BFGS METHODS WITH FUNCTION VALUE INFORMATION FOR UNCONSTRAINED OPTIMIZATION ⋮ A CLASS OF DFP ALGORITHMS WITH REVISED SEARCH DIRECTION ⋮ An adaptive sizing BFGS method for unconstrained optimization
This page was built for publication: Global Convergence of a Class of Quasi-Newton Methods on Convex Problems