A sub-sampled tensor method for nonconvex optimization


DOI: 10.1093/IMANUM/DRAC057
arXiv: 1911.10367
OpenAlex: W4302760397
MaRDI QID: Q6190818

Jonas Köhler (co-author name not available)

Publication date: 6 February 2024

Published in: IMA Journal of Numerical Analysis

Abstract: We present a stochastic optimization method that uses a fourth-order regularized model to find local minima of smooth and potentially non-convex objective functions with a finite-sum structure. The algorithm uses sub-sampled derivatives instead of exact quantities. The proposed approach is shown to find an $(\epsilon_1,\epsilon_2,\epsilon_3)$-third-order critical point in at most $\mathcal{O}\big(\max(\epsilon_1^{-4/3}, \epsilon_2^{-2}, \epsilon_3^{-4})\big)$ iterations, thereby matching the rate of deterministic approaches. In order to prove this result, we derive a novel tensor concentration inequality for sums of tensors of any order that makes explicit use of the finite-sum structure of the objective function.
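
For orientation, the generic shape of a sub-sampled fourth-order regularized model of the kind described in the abstract is sketched below; the notation (sample sets $S_1, S_2, S_3$, regularization weight $\sigma_k$) is assumed here for illustration and is not necessarily the paper's.

\[
m_k(s) \;=\; f(x_k) + g_{S_1}(x_k)^{\top} s + \tfrac{1}{2}\, s^{\top} H_{S_2}(x_k)\, s + \tfrac{1}{6}\, T_{S_3}(x_k)[s,s,s] + \tfrac{\sigma_k}{4}\, \|s\|^{4},
\]

where $g_{S_1}$, $H_{S_2}$, and $T_{S_3}$ are gradient, Hessian, and third-derivative estimates computed from sub-sampled index sets of the finite sum $f(x) = \tfrac{1}{n}\sum_{i=1}^{n} f_i(x)$, and $\sigma_k > 0$ is the regularization parameter.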


Full work available at URL: https://arxiv.org/abs/1911.10367
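
A minimal sketch, assuming JAX, of how sub-sampled gradient, Hessian, and third-derivative estimates for a finite-sum objective can be formed; the component loss, data, and batch size are hypothetical placeholders and not taken from the paper.

import jax
import jax.numpy as jnp

def component_loss(x, a, b):
    # Hypothetical smooth, non-convex component f_i(x), parameterized by data (a_i, b_i).
    return jnp.sin(jnp.dot(a, x)) + 0.5 * (jnp.dot(a, x) - b) ** 2

def subsampled_derivatives(x, A, y, key, batch_size):
    # Draw a uniform sub-sample of the n components and build derivative
    # estimates from the sub-sampled average instead of the exact finite sum.
    n = A.shape[0]
    idx = jax.random.choice(key, n, shape=(batch_size,), replace=False)
    batch_loss = lambda z: jnp.mean(
        jax.vmap(component_loss, in_axes=(None, 0, 0))(z, A[idx], y[idx]))
    g = jax.grad(batch_loss)(x)                 # sub-sampled gradient estimate
    H = jax.hessian(batch_loss)(x)              # sub-sampled Hessian estimate
    T = jax.jacfwd(jax.hessian(batch_loss))(x)  # sub-sampled third-derivative tensor
    return g, H, T

key, data_key = jax.random.split(jax.random.PRNGKey(0))
A = jax.random.normal(data_key, (1000, 5))      # hypothetical data
y = jax.random.normal(data_key, (1000,))        # hypothetical targets
g, H, T = subsampled_derivatives(jnp.zeros(5), A, y, key, batch_size=64)
print(g.shape, H.shape, T.shape)                # (5,), (5, 5), (5, 5, 5)

In a method of this kind, such estimates would be plugged into the regularized fourth-order model sketched above, which is then approximately minimized at every iteration.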






Cited In (1)






