Sequential convergence of AdaGrad algorithm for smooth convex optimization

From MaRDI portal
Publication:6354489

DOI: 10.1016/J.ORL.2021.04.011
arXiv: 2011.12341
MaRDI QID: Q6354489


Authors: Cheik Traoré, Edouard Pauwels


Publication date: 24 November 2020

Abstract: We prove that the iterates produced by either the scalar step-size variant or the coordinatewise variant of the AdaGrad algorithm are convergent sequences when applied to convex objective functions with Lipschitz-continuous gradient. The key insight is that such AdaGrad sequences satisfy a variable-metric quasi-Fejér monotonicity property, which allows one to prove convergence.
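For illustration only (this is not code from the paper): the coordinatewise AdaGrad update discussed in the abstract can be sketched on a smooth convex quadratic with Lipschitz gradient. The objective, step size `eta`, and stabilizer `eps` below are illustrative assumptions; the minimizer of the test function is the origin, so the iterates should form a sequence converging to it.

```python
import numpy as np

# Hedged sketch of coordinatewise AdaGrad on the smooth convex quadratic
# f(x) = 0.5 * (x1**2 + 10 * x2**2), whose gradient is Lipschitz and whose
# minimizer is the origin. eta and eps are illustrative choices.
H = np.array([1.0, 10.0])      # diagonal Hessian of the quadratic
x = np.array([1.0, 1.0])       # starting point
accum = np.zeros(2)            # per-coordinate running sum of squared gradients
eta, eps = 0.5, 1e-8

for _ in range(1000):
    g = H * x                              # gradient of the quadratic
    accum += g * g                         # accumulate squared gradients
    x -= eta * g / (np.sqrt(accum) + eps)  # coordinatewise adaptive step

print(np.linalg.norm(x))       # distance of the final iterate to the minimizer
```

Each coordinate gets its own effective step size `eta / sqrt(accum_i)`, which is the adaptive metric that the paper's quasi-Fejér monotonicity argument is built around.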

