Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
From MaRDI portal
Publication:3805796
DOI: 10.1137/0724077 · zbMath: 0657.65083 · OpenAlex: W2045968916 · MaRDI QID: Q3805796
Byrd, Richard H.; Nocedal, Jorge; Yuan, Ya-Xiang
Publication date: 1987
Published in: SIAM Journal on Numerical Analysis
Full work available at URL: https://doi.org/10.1137/0724077
Keywords: convergence · numerical example · nonlinear optimization · minimization · BFGS method · quasi-Newton methods · quasi-Newton updates · optimization algorithms · superlinear convergence · line searches · DFP method · Broyden's one-parameter class
Related Items (first 100 items)
A family of hybrid conjugate gradient methods for unconstrained optimization ⋮ Unnamed Item ⋮ Towards explicit superlinear convergence rate for SR1 ⋮ Non-asymptotic superlinear convergence of standard quasi-Newton methods ⋮ A statistical multivariable optimization method using improved orthogonal algorithm based on large data ⋮ A novel iterative learning control scheme based on Broyden-class optimization method ⋮ A \(J\)-symmetric quasi-Newton method for minimax problems ⋮ Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization ⋮ Greedy PSB methods with explicit superlinear convergence ⋮ The regularization continuation method for optimization problems with nonlinear equality constraints ⋮ An adaptive projection BFGS method for nonconvex unconstrained optimization problems ⋮ Unnamed Item ⋮ Convergence analysis of the self-dual optimally conditioned SSVM method of Oren-Spedicato ⋮ On the Local and Superlinear Convergence of a Parameterized DFP Method ⋮ Oblique projections, Broyden restricted class and limited-memory quasi-Newton methods ⋮ New BFGS method for unconstrained optimization problem based on modified Armijo line search ⋮ A modified BFGS method and its global convergence in nonconvex minimization ⋮ On the behaviour of a combined extra-updating/self-scaling BFGS method ⋮ Convergence analysis of the Levenberg–Marquardt method ⋮ Superlinear convergence of the DFP algorithm without exact line search ⋮ Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations ⋮ Convergence of the DFP algorithm without exact line search ⋮ An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration ⋮ Unnamed Item ⋮ Greedy Quasi-Newton Methods with Explicit Superlinear Convergence ⋮ Global convergence of the Broyden's class of quasi-Newton methods with nonmonotone linesearch ⋮ A new trust region method with adaptive radius for unconstrained optimization ⋮ A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem ⋮ A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems ⋮ Family of optimally conditioned quasi-Newton updates for unconstrained optimization ⋮ Local and superlinear convergence of quasi-Newton methods based on modified secant conditions ⋮ A new quasi-Newton algorithm ⋮ Convergence and numerical results for a parallel asynchronous quasi-Newton method ⋮ On \(q\)-BFGS algorithm for unconstrained optimization problems ⋮ A parallel quasi-Newton algorithm for unconstrained optimization ⋮ Efficient line search algorithm for unconstrained optimization ⋮ Limited-memory BFGS with displacement aggregation ⋮ Rates of superlinear convergence for classical quasi-Newton methods ⋮ An analysis of reduced Hessian methods for constrained optimization ⋮ Global convergence of the non-quasi-Newton method for unconstrained optimization problems ⋮ Modifying the BFGS method ⋮ Damped techniques for enforcing convergence of quasi-Newton methods ⋮ Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems ⋮ The global convergence of the BFGS method with a modified WWP line search for nonconvex functions ⋮ The regularization continuation method with an adaptive time step control for linearly constrained optimization problems ⋮ Some convergence properties of descent methods ⋮ Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization ⋮ A modified nonmonotone BFGS algorithm for unconstrained optimization ⋮ A Modified Non-Monotone BFGS Method for Non-Convex Unconstrained Optimization ⋮ A class of one parameter conjugate gradient methods ⋮ A hybrid algorithm for linearly constrained minimax problems ⋮ The global convergence of a modified BFGS method for nonconvex functions ⋮ Global convergence of a modified Broyden family method for nonconvex functions ⋮ A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization ⋮ Spectral scaling BFGS method ⋮ Nonsmooth optimization via quasi-Newton methods ⋮ A double parameter scaled BFGS method for unconstrained optimization ⋮ Global convergence of a modified limited memory BFGS method for non-convex minimization ⋮ A Broyden Class of Quasi-Newton Methods for Riemannian Optimization ⋮ Comparative analysis of gradient methods for source identification in a diffusion-logistic model ⋮ A new BFGS algorithm using the decomposition matrix of the correction matrix to obtain the search directions ⋮ A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems ⋮ Low complexity matrix projections preserving actions on vectors ⋮ The projection technique for two open problems of unconstrained optimization problems ⋮ The convergence of a new modified BFGS method without line searches for unconstrained optimization or complexity systems ⋮ On the stable global convergence of particular quasi-Newton methods ⋮ New quasi-Newton methods via higher order tensor models ⋮ A perfect example for the BFGS method ⋮ Some numerical experiments with variable-storage quasi-Newton algorithms ⋮ Secant penalized BFGS: a noise robust quasi-Newton method via penalizing the secant condition ⋮ A combined class of self-scaling and modified quasi-Newton methods ⋮ A new class of quasi-Newton updating formulas ⋮ The hybrid BFGS-CG method in solving unconstrained optimization problems ⋮ The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions ⋮ A Bregman extension of quasi-Newton updates. II: Analysis of robustness properties ⋮ Diagonal quasi-Newton methods via least change updating principle with weighted Frobenius norm ⋮ The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients ⋮ Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search ⋮ A partitioned PSB method for partially separable unconstrained optimization problems ⋮ Analysis of sparse quasi-Newton updates with positive definite matrix completion ⋮ The global and superlinear convergence of a new nonmonotone MBFGS algorithm on convex objective functions ⋮ Block BFGS Methods ⋮ A class of diagonal quasi-Newton methods for large-scale convex minimization ⋮ Global convergence property of scaled two-step BFGS method ⋮ A modified BFGS method and its superlinear convergence in nonconvex minimization with general line search rule ⋮ Quasi-Newton methods: superlinear convergence without line searches for self-concordant functions ⋮ Using gradient directions to get global convergence of Newton-type methods ⋮ An adaptive scaled BFGS method for unconstrained optimization ⋮ A new modified BFGS method for unconstrained optimization problems ⋮ A regularized limited memory BFGS method for nonconvex unconstrained minimization ⋮ Analysis of a self-scaling quasi-Newton method ⋮ A nonmonotone PSB algorithm for solving unconstrained optimization ⋮ Sufficient descent directions in unconstrained optimization ⋮ Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization ⋮ Convergence property of a class of variable metric methods. ⋮ New quasi-Newton methods for unconstrained optimization problems ⋮ Convergence analysis of a modified BFGS method on convex minimizations ⋮ Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints ⋮ A variation of Broyden class methods using Householder adaptive transforms ⋮ A globally convergent BFGS method for nonlinear monotone equations without any merit functions
This page was built for publication: Global Convergence of a Class of Quasi-Newton Methods on Convex Problems