LSOS: Line-search Second-Order Stochastic optimization methods for nonconvex finite sums

From MaRDI portal
Publication:6346231

DOI: 10.1090/MCOM/3802
zbMATH Open: 1511.65048
arXiv: 2007.15966
MaRDI QID: Q6346231
FDO: Q6346231


Authors: D. di Serafino, Nataša Krejić, Nataša Krklec Jerinkić, Marco Viola


Publication date: 31 July 2020

Abstract: We develop a line-search second-order algorithmic framework for minimizing finite sums. We do not make any convexity assumptions, but require the terms of the sum to be continuously differentiable and have Lipschitz-continuous gradients. The methods fitting into this framework combine line searches and suitably decaying step lengths. A key issue is a two-step sampling at each iteration, which allows us to control the error present in the line-search procedure. Stationarity of limit points is proved in the almost-sure sense, while almost-sure convergence of the sequence of approximations to the solution holds with the additional hypothesis that the functions are strongly convex. Numerical experiments, including comparisons with state-of-the-art stochastic optimization methods, show the efficiency of our approach.
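The abstract describes combining subsampled gradients, a line search evaluated on a second independent sample, and decaying step lengths. The following is a minimal illustrative sketch of that idea, not the paper's exact LSOS algorithm: it uses a first-order direction (where LSOS would use second-order information), a synthetic least-squares finite sum, and hypothetical parameter choices.

```python
import numpy as np

# Hypothetical finite-sum problem: f(x) = (1/N) * sum_i 0.5 * (a_i^T x - b_i)^2
rng = np.random.default_rng(0)
N, d = 200, 5
A = rng.standard_normal((N, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(N)   # noisy linear model

def f_sub(x, idx):
    """Average loss over the sampled terms."""
    r = A[idx] @ x - b[idx]
    return 0.5 * np.mean(r * r)

def grad_sub(x, idx):
    """Subsampled gradient estimate."""
    r = A[idx] @ x - b[idx]
    return A[idx].T @ r / len(idx)

def stochastic_linesearch_sketch(x, iters=200, batch=32, c1=1e-4, rho=0.5):
    for k in range(1, iters + 1):
        # First sample: build the search direction.
        S1 = rng.choice(N, batch, replace=False)
        g = grad_sub(x, S1)
        p = -g  # steepest descent here; a second-order method would
                # precondition p with (sub)sampled curvature information

        # Second, independent sample: evaluate the line-search condition,
        # which mirrors the two-step sampling mentioned in the abstract.
        S2 = rng.choice(N, batch, replace=False)
        t = 1.0 / np.sqrt(k)  # suitably decaying initial step length
        f0 = f_sub(x, S2)
        while f_sub(x + t * p, S2) > f0 + c1 * t * (g @ p) and t > 1e-10:
            t *= rho  # Armijo-style backtracking on the second sample
        x = x + t * p
    return x

x_star = np.linalg.lstsq(A, b, rcond=None)[0]
x_hat = stochastic_linesearch_sketch(np.zeros(d))
print(np.linalg.norm(x_hat - x_star))
```

The independent second sample keeps the line-search test from being biased by the same noise that produced the direction; the decaying step cap damps the residual stochastic error as the iterations proceed.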

