Global and superlinear convergence of a restricted class of self-scaling methods with inexact line searches, for convex functions
From MaRDI portal
Publication:1383833
DOI: 10.1023/A:1018315205474
zbMath: 0904.90127
MaRDI QID: Q1383833
Publication date: 19 January 1999
Published in: Computational Optimization and Applications
Keywords: unconstrained optimization; convergence properties; inexact line searches; Broyden's class; self-scaling quasi-Newton methods
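To illustrate the topic of the record, here is a minimal sketch of a self-scaling BFGS iteration with an inexact (Armijo backtracking) line search, applied to a convex quadratic. The scaling factor and the test problem are illustrative assumptions, not the specific algorithm analyzed in the paper: the Oren–Luenberger factor τ = sᵀy / (yᵀHy) is one common choice within the self-scaling Broyden family.

```python
import numpy as np

def self_scaling_bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """Sketch of a self-scaling BFGS method (inverse-Hessian form).

    The scaling factor tau = s^T y / (y^T H y) rescales the previous
    approximation H before the standard BFGS update; Armijo backtracking
    stands in for an inexact line search.  Illustrative only.
    """
    n = x0.size
    x, H = x0.astype(float), np.eye(n)
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                      # quasi-Newton search direction
        # Armijo backtracking (inexact line search)
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                  # curvature condition s^T y > 0 holds
            Hy = H @ y
            tau = sy / (y @ Hy)         # self-scaling factor (assumed choice)
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = tau * (V @ H @ V.T) + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Usage: a strictly convex quadratic with minimizer at x = [1, -2]
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = A @ np.array([1.0, -2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = self_scaling_bfgs(f, grad, np.zeros(2))
```

On convex quadratics the curvature condition sᵀy > 0 holds automatically, so the update is always applied; the paper's subject is the global and superlinear convergence of such methods on general convex functions.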
Related Items
- On the behaviour of a combined extra-updating/self-scaling BFGS method
- Non-asymptotic superlinear convergence of standard quasi-Newton methods
- Scaling damped limited-memory updates for unconstrained optimization
- A combined class of self-scaling and modified quasi-Newton methods
- A class of diagonal quasi-Newton methods for large-scale convex minimization
- Spectral scaling BFGS method
- An adaptive scaled BFGS method for unconstrained optimization
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- A double parameter scaled BFGS method for unconstrained optimization
- Global convergence property of scaled two-step BFGS method
- A variation of Broyden class methods using Householder adaptive transforms
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- An adaptive sizing BFGS method for unconstrained optimization
- The global convergence of self-scaling BFGS algorithm with non-monotone line search for unconstrained nonconvex optimization problems
- Computational experiments with scaled initial Hessian approximation for the Broyden family methods
- Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems
- Wide interval for efficient self-scaling quasi-Newton algorithms