Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization

From MaRDI portal
Publication:2235149

DOI: 10.1007/S10107-020-01534-W
zbMATH Open: 1478.90097
arXiv: 1807.00387
OpenAlex: W3040672097
MaRDI QID: Q2235149
FDO: Q2235149


Authors: Szilárd László


Publication date: 20 October 2021

Published in: Mathematical Programming. Series A. Series B

Abstract: We investigate an inertial algorithm of gradient type in connection with the minimization of a nonconvex differentiable function. The algorithm is formulated in the spirit of Nesterov's accelerated convex gradient method. We show that the generated sequences converge to a critical point of the objective function if a regularization of the objective function satisfies the Kurdyka-Łojasiewicz property. Further, we provide convergence rates for the generated sequences and the function values, formulated in terms of the Łojasiewicz exponent.
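To make the abstract concrete, the following is a minimal sketch of a *generic* inertial (Nesterov/heavy-ball-style) gradient iteration for a smooth nonconvex function: an extrapolation step y_k = x_k + beta (x_k - x_{k-1}) followed by a gradient step from y_k. The function `inertial_gradient`, the test objective, and all parameter values here are illustrative assumptions; the paper's precise scheme, step sizes, and inertial coefficients may differ.

```python
import numpy as np

def inertial_gradient(grad, x0, step=0.01, beta=0.5, iters=500, tol=1e-10):
    """Generic inertial gradient iteration (illustrative sketch, not the
    paper's exact algorithm):
        y_k     = x_k + beta * (x_k - x_{k-1})   # inertial extrapolation
        x_{k+1} = y_k - step * grad(y_k)         # gradient step at y_k
    Stops early once the gradient norm falls below `tol`."""
    x_prev = np.array(x0, dtype=float)
    x = x_prev.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)
        x_prev, x = x, y - step * grad(y)
        if np.linalg.norm(grad(x)) < tol:
            break
    return x

# Hypothetical smooth nonconvex example: f(x) = x^4/4 - x^2/2,
# with critical points at -1, 0, and 1.
grad = lambda x: x**3 - x
x_star = inertial_gradient(grad, np.array([0.7]))
```

Starting from 0.7, the iterates settle at the nearby critical point 1; under a Łojasiewicz-type property of the objective (as assumed in the paper), such convergence to a single critical point, with an associated rate, is exactly what the results above quantify.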


Full work available at URL: https://arxiv.org/abs/1807.00387










Cited In (18)





