Stopping criteria for, and strong convergence of, stochastic gradient descent on Bottou-Curtis-Nocedal functions

DOI: 10.1007/s10107-021-01710-6
zbMATH Open: 1505.65219
arXiv: 2004.00475
OpenAlex: W3208582873
MaRDI QID: Q2089787
FDO: Q2089787


Author: Vivak Patel


Publication date: 24 October 2022

Published in: Mathematical Programming. Series A. Series B

Abstract: Stopping criteria for Stochastic Gradient Descent (SGD) methods play important roles, from enabling adaptive step size schemes to providing rigor for downstream analyses such as asymptotic inference. Unfortunately, current stopping criteria for SGD methods are often heuristics that rely on asymptotic normality results or convergence to stationary distributions, which may fail to exist for nonconvex functions and thereby limit the applicability of such stopping criteria. To address this issue, in this work, we rigorously develop two stopping criteria for SGD that can be applied to a broad class of nonconvex functions, which we term Bottou-Curtis-Nocedal functions. Moreover, as a prerequisite for developing these stopping criteria, we prove that the gradient function evaluated at SGD's iterates converges strongly to zero for Bottou-Curtis-Nocedal functions, which addresses an open question in the SGD literature. As a result, our rigorously developed stopping criteria can be used to develop new adaptive step size schemes or bolster other downstream analyses for nonconvex functions.


Full work available at URL: https://arxiv.org/abs/2004.00475
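
To illustrate the problem the paper addresses, the minimal Python sketch below runs SGD on a toy nonconvex objective and stops once a windowed average of stochastic gradient norms falls below a tolerance. This is a heuristic stand-in for a rigorous stopping rule: the objective, the noise model, the window size, the tolerance, and the function names `stochastic_gradient` and `sgd_with_stopping` are all illustrative assumptions, not the criteria developed in the paper.

```python
import numpy as np

# Illustrative nonconvex objective f(x) = sum(x_i^2 + 3*sin(x_i)^2),
# whose only stationary point is the global minimizer x = 0.
# Everything here is a hypothetical stand-in, not the paper's method.

def stochastic_gradient(x, rng, noise_scale=0.1):
    """True gradient plus zero-mean Gaussian noise, mimicking an
    unbiased stochastic first-order oracle."""
    grad = 2.0 * x + 3.0 * np.sin(2.0 * x)  # d/dx [x^2 + 3 sin^2(x)]
    return grad + noise_scale * rng.standard_normal(x.shape)

def sgd_with_stopping(x0, step=0.05, window=200, tol=0.2,
                      max_iters=50_000, seed=0):
    """Run SGD and stop once a trailing average of stochastic gradient
    norms falls below `tol`; averaging damps single-sample noise."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    norms = []
    for k in range(max_iters):
        g = stochastic_gradient(x, rng)
        x = x - step * g
        norms.append(np.linalg.norm(g))
        if len(norms) > window:
            norms.pop(0)
        if k + 1 >= window and np.mean(norms) < tol:
            return x, k + 1
    return x, max_iters

x_final, iters = sgd_with_stopping(x0=np.array([2.5, -1.7]))
print(f"stopped after {iters} iterations at x = {x_final}")
```

Note that `tol` must sit above the oracle's noise floor (here roughly `noise_scale * sqrt(dim)`, since averaging norms does not remove the noise's contribution to each norm), which hints at why such heuristic rules are fragile on nonconvex problems and why rigorously justified criteria, as developed in the paper, are of interest.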










Cited in: 5 publications




