Accelerating Nesterov's Method for Strongly Convex Functions with Lipschitz Gradient
Publication: 6228010
arXiv: 1109.6058
MaRDI QID: Q6228010
Publication date: 27 September 2011
Abstract: We modify Nesterov's constant step gradient method for strongly convex functions with Lipschitz continuous gradient, as described in Nesterov's book. Nesterov shows that $f(x_k) - f^\star \le L\,\|x_0 - x^\star\|^2\,(1 - \sqrt{q})^k$ with $\alpha_k = \sqrt{q}$ for all $k$, where $L$ is the Lipschitz gradient constant and $q$ is the reciprocal condition number of $f$. Hence the convergence rate is $1 - \sqrt{q}$. In this work, we try to accelerate Nesterov's method by adaptively searching for an $\alpha_k$ at each iteration. The proposed method evaluates the gradient at most twice per iteration and requires some extra Level 1 BLAS operations. Theoretically, in the worst case, it takes the same number of iterations as Nesterov's method but doubles the number of gradient calls. In practice, however, the proposed method effectively accelerates convergence for many problems, including a smoothed basis pursuit denoising problem.
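For orientation, the following is a minimal NumPy sketch of the baseline scheme the abstract builds on: Nesterov's constant-step method for a $\mu$-strongly convex function with $L$-Lipschitz gradient, using the fixed momentum corresponding to $\alpha_k = \sqrt{q}$. It does not reproduce the paper's adaptive search for $\alpha_k$; the function names and the quadratic test problem are illustrative assumptions.

```python
import numpy as np

def nesterov_constant_step(grad, x0, L, mu, iters=500):
    """Nesterov's constant-step scheme for mu-strongly convex f with
    L-Lipschitz gradient (baseline method from Nesterov's book)."""
    q = mu / L                                   # reciprocal condition number
    beta = (1 - np.sqrt(q)) / (1 + np.sqrt(q))   # fixed momentum (alpha_k = sqrt(q))
    x = x0.copy()
    y = x0.copy()
    for _ in range(iters):
        x_next = y - grad(y) / L                 # gradient step at the extrapolated point
        y = x_next + beta * (x_next - x)         # momentum (extrapolation) step
        x = x_next
    return x

# Illustrative usage on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = np.diag(np.linspace(1.0, 100.0, 50))     # eigenvalues span [mu, L] = [1, 100]
    b = rng.standard_normal(50)
    grad = lambda x: A @ x - b
    x_star = np.linalg.solve(A, b)
    x_hat = nesterov_constant_step(grad, np.zeros(50), L=100.0, mu=1.0, iters=300)
    print("distance to optimum:", np.linalg.norm(x_hat - x_star))
```

The paper's proposal replaces the fixed $\alpha_k = \sqrt{q}$ with a per-iteration adaptive choice, at the cost of at most one additional gradient evaluation per iteration plus a few Level 1 BLAS operations.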