Super-Acceleration with Cyclical Step-sizes

From MaRDI portal

arXiv: 2106.09687 · MaRDI QID: Q6370588 · FDO: Q6370588

Adrien B. Taylor, Baptiste Goujaud, Damien Scieur, Aymeric Dieuleveut, Fabian Pedregosa

Publication date: 17 June 2021

Abstract: We develop a convergence-rate analysis of momentum with cyclical step-sizes. We show that, under an assumption on the spectral gap of Hessians that is common in machine learning, cyclical step-sizes are provably faster than constant step-sizes. More precisely, we develop a convergence-rate analysis for quadratic objectives that provides optimal parameters and shows that cyclical learning rates can improve upon traditional lower complexity bounds. We further propose a systematic approach to designing optimal first-order methods for quadratic minimization with a given spectral structure. Finally, we provide a local convergence-rate analysis beyond quadratic minimization for the proposed methods and illustrate our findings through benchmarks on least-squares and logistic regression problems.
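
To make the idea concrete, the following is a minimal sketch, not the paper's parameterization or its companion code: heavy-ball momentum whose step-size cycles through a short list of values, applied to a least-squares quadratic. The helper name `cyclical_heavy_ball`, the two step-sizes, and the momentum value are ad-hoc illustrative choices.

```python
# Minimal sketch (assumed, not from the paper's code): heavy-ball momentum
# with a step-size that cycles through a fixed list, on f(x) = 0.5*||Ax - b||^2.
import numpy as np

def cyclical_heavy_ball(A, b, step_sizes, momentum=0.5, n_iters=300):
    """Illustrative momentum method with cyclical step-sizes (hypothetical helper)."""
    x = x_prev = np.zeros(A.shape[1])
    for t in range(n_iters):
        grad = A.T @ (A @ x - b)             # gradient of the quadratic objective
        h = step_sizes[t % len(step_sizes)]  # cycle through the step-size schedule
        x, x_prev = x - h * grad + momentum * (x - x_prev), x
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
L = np.linalg.norm(A, ord=2) ** 2            # largest eigenvalue of A^T A
x_hat = cyclical_heavy_ball(A, b, step_sizes=[1.0 / L, 0.3 / L])
print("objective:", 0.5 * np.linalg.norm(A @ x_hat - b) ** 2)
```

In the paper's setting, the step-size schedule and momentum are derived from the spectral structure of the Hessian rather than hard-coded; the values above are placeholders for illustration only.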

Has companion code repository: https://github.com/google/jaxopt
