Three-point Step Size Gradient Method with Relaxed Generalized Armijo Step Size Rule

From MaRDI portal
Publication: 6402009

arXiv: 2206.06631 · MaRDI QID: Q6402009 · FDO: Q6402009

Zhao Xu, Wang Jian, Sun Qingying

Publication date: 14 June 2022

Abstract: Based on differences of points and differences of gradients over the most recent three iterations, together with Taylor's theorem, two forms of the quasi-Newton equation at the current iteration are constructed. Using these two forms of the quasi-Newton equation and the method of least squares, three-point step size gradient methods (TBB) for solving unconstrained optimization problems are proposed. Under the relaxed generalized Armijo step size rule, the new method is proved to be globally convergent whenever the gradient function is uniformly continuous. Moreover, when the objective function is pseudo-convex (quasi-convex), the new method has strong convergence results. Under suitable additional assumptions, the new method is also shown to converge superlinearly and linearly. Although multi-point information is used, TBB retains simplicity, a low memory requirement, and the use of only first-order information, so the new method is well suited to large-scale optimization problems. Numerical experiments confirm the efficiency and robustness of TBB.
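
The abstract does not state the explicit step size formulas. As a minimal illustrative sketch only, the following Python code shows the general idea it describes: a gradient method whose scalar step size is fitted by least squares to the point and gradient differences spanning the most recent three iterates, safeguarded by a backtracking line search. The function name `three_point_bb_gradient`, the particular least-squares fit, and the use of plain backtracking in place of the paper's relaxed generalized Armijo rule are all assumptions for illustration, not the authors' TBB method.

```python
import numpy as np

def three_point_bb_gradient(f, grad, x0, alpha0=1.0, sigma=1e-4, beta=0.5,
                            alpha_min=1e-10, alpha_max=1e10,
                            tol=1e-6, max_iter=1000):
    """Gradient descent with a BB-like step size fitted by least squares to
    the difference pairs of the most recent three iterates, safeguarded by a
    backtracking (Armijo-type) line search.

    Illustrative sketch only; it approximates, but is not, the TBB method
    and its relaxed generalized Armijo rule described in the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    s_hist, y_hist = [], []          # s_i = x_{i+1} - x_i, y_i = g_{i+1} - g_i
    alpha = alpha0

    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break

        # Trial step from the (up to two) stored difference pairs via least squares:
        # alpha = argmin_a sum_i ||s_i - a*y_i||^2 = (sum s_i^T y_i) / (sum y_i^T y_i)
        if s_hist:
            num = sum(np.dot(s, y) for s, y in zip(s_hist, y_hist))
            den = sum(np.dot(y, y) for y in y_hist)
            if den > 0 and num > 0:
                alpha = min(max(num / den, alpha_min), alpha_max)

        # Backtrack until the Armijo sufficient-decrease condition holds
        t = alpha
        fx = f(x)
        while f(x - t * g) > fx - sigma * t * np.dot(g, g):
            t *= beta
            if t < alpha_min:
                break

        x_new = x - t * g
        g_new = grad(x_new)

        # Keep only the differences spanning the most recent three iterates
        s_hist.append(x_new - x)
        y_hist.append(g_new - g)
        if len(s_hist) > 2:
            s_hist.pop(0)
            y_hist.pop(0)

        x, g = x_new, g_new

    return x


if __name__ == "__main__":
    # Small convex quadratic test problem (hypothetical example data)
    A = np.diag([1.0, 10.0, 100.0])
    b = np.array([1.0, 2.0, 3.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    print(three_point_bb_gradient(f, grad, np.zeros(3)), np.linalg.solve(A, b))
```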

This page was built for publication: Three-point Step Size Gradient Method with Relaxed Generalized Armijo Step Size Rule
