How bad are the BFGS and DFP methods when the objective function is quadratic?
DOI: 10.1007/BF01582161
zbMath: 0581.90068
OpenAlex: W2037258733
MaRDI QID: Q3703596
Publication date: 1986
Published in: Mathematical Programming
Full work available at URL: https://doi.org/10.1007/bf01582161
Keywords: convergence analysis; unconstrained optimization; variable metric methods; comparison of algorithms; BFGS algorithm; DFP method
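The title and keywords concern how the BFGS and DFP variable metric updates behave when the objective function is quadratic. As a rough illustration only, and not the paper's analysis or experiments, the sketch below applies both Hessian-approximation updates with an exact line search to a convex quadratic; the function and variable names (bfgs_update, dfp_update, quasi_newton, the test matrix A) are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not from the publication): BFGS and DFP
# Hessian-approximation updates applied to a convex quadratic
# f(x) = 0.5 x^T A x - b^T x, with step lengths from an exact line search.
import numpy as np

def bfgs_update(B, s, y):
    """BFGS update of the Hessian approximation B from step s and gradient change y."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

def dfp_update(B, s, y):
    """DFP update of the Hessian approximation B from step s and gradient change y."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(y, s)
    return V @ B @ V.T + rho * np.outer(y, y)

def quasi_newton(A, b, x0, update, iters=50):
    """Quasi-Newton iteration with exact line search on f(x) = 0.5 x^T A x - b^T x."""
    x, B = x0.copy(), np.eye(len(x0))
    for _ in range(iters):
        g = A @ x - b                      # gradient of the quadratic
        d = -np.linalg.solve(B, g)         # quasi-Newton search direction
        alpha = -(g @ d) / (d @ (A @ d))   # exact minimizer along d
        s = alpha * d
        x_new = x + s
        y = A @ s                          # gradient change equals A s for a quadratic
        B = update(B, s, y)
        x = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Q = rng.standard_normal((5, 5))
    A = Q @ Q.T + 5 * np.eye(5)            # symmetric positive definite Hessian
    b = rng.standard_normal(5)
    x_star = np.linalg.solve(A, b)
    for name, upd in (("BFGS", bfgs_update), ("DFP", dfp_update)):
        x = quasi_newton(A, b, np.zeros(5), upd, iters=10)
        print(name, "error:", np.linalg.norm(x - x_star))
```

With exact line searches on a strictly convex quadratic, both methods reach the minimizer in at most n iterations; the paper's question concerns how the two updates compare away from this idealized setting (e.g. under inexact line searches).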
Related Items
Convergence properties of the Broyden-like method for mixed linear-nonlinear systems of equations
On the performance of switching BFGS/SR1 algorithms for unconstrained optimization
Modifying the BFGS update by a new column scaling technique
The least prior deviation quasi-Newton update
Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
The global convergence of self-scaling BFGS algorithm with non-monotone line search for unconstrained nonconvex optimization problems
Modifying the BFGS method
Damped techniques for enforcing convergence of quasi-Newton methods
A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
A double parameter scaled BFGS method for unconstrained optimization
Nonmonotone BFGS-trained recurrent neural networks for temporal sequence processing
An improved hybrid quantum optimization algorithm for solving nonlinear equations
A combined class of self-scaling and modified quasi-Newton methods
A symmetric rank-one method based on extra updating techniques for unconstrained optimization
A Bregman extension of quasi-Newton updates. II: Analysis of robustness properties
Analysis of sparse quasi-Newton updates with positive definite matrix completion
Global convergence property of scaled two-step BFGS method
Exploiting damped techniques for nonlinear conjugate gradient methods
An adaptive scaled BFGS method for unconstrained optimization
Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
Newton methods to solve a system of nonlinear algebraic equations
A nonmonotone Broyden method for unconstrained optimization
On the behaviour of a combined extra-updating/self-scaling BFGS method
Implementing and modifying Broyden class updates for large scale optimization
Wide interval for efficient self-scaling quasi-Newton algorithms
Effect of dimensionality on the Nelder–Mead simplex method
The revised DFP algorithm without exact line search
Testing a Class of Methods for Solving Minimization Problems with Simple Bounds on the Variables
Extra updates for the BFGS method
Variational quasi-Newton methods for unconstrained optimization
A class of DFP algorithms with revised search direction