Accelerating Nesterov's Method for Strongly Convex Functions with Lipschitz Gradient

From MaRDI portal
Publication:6228010

arXiv: 1109.6058 · MaRDI QID: Q6228010

Hao Chen, Xiangrui Meng

Publication date: 27 September 2011

Abstract: We modify Nesterov's constant step gradient method for strongly convex functions with Lipschitz continuous gradient described in Nesterov's book. Nesterov shows that $f(x_k) - f^* \leq L \prod_{i=1}^{k}(1-\alpha_k)\,\|x_0 - x^*\|_2^2$ with $\alpha_k = \sqrt{\rho}$ for all $k$, where $L$ is the Lipschitz gradient constant and $\rho$ is the reciprocal condition number of $f(x)$. Hence the convergence rate is $1-\sqrt{\rho}$. In this work, we try to accelerate Nesterov's method by adaptively searching for an $\alpha_k > \sqrt{\rho}$ at each iteration. The proposed method evaluates the gradient function at most twice per iteration and incurs a few extra Level 1 BLAS operations. Theoretically, in the worst case, it takes the same number of iterations as Nesterov's method does but doubles the gradient calls. However, in practice, the proposed method effectively accelerates convergence on many problems, including a smoothed basis pursuit denoising problem.
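For reference, the baseline that the paper modifies can be sketched as follows. This is a minimal implementation of Nesterov's constant-step scheme for a $\mu$-strongly convex function with $L$-Lipschitz gradient, where the momentum coefficient is derived from the reciprocal condition number $\rho = \mu/L$; the adaptive search for $\alpha_k > \sqrt{\rho}$ proposed in the paper is not reproduced here. The function names and the quadratic test problem are illustrative assumptions, not from the source.

```python
import numpy as np

def nesterov_constant_step(grad, x0, L, mu, iters=200):
    """Nesterov's constant-step method for mu-strongly convex f
    with L-Lipschitz gradient (constant-momentum variant)."""
    rho = mu / L                                     # reciprocal condition number
    beta = (1 - np.sqrt(rho)) / (1 + np.sqrt(rho))   # constant momentum coefficient
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    for _ in range(iters):
        x_next = y - grad(y) / L                     # gradient step at extrapolated point
        y = x_next + beta * (x_next - x)             # momentum extrapolation
        x = x_next
    return x

# Toy problem (illustrative): f(x) = 0.5 * x^T A x, minimized at x = 0,
# with A diagonal so that mu = 1 and L = 100.
A = np.diag([1.0, 10.0, 100.0])
grad = lambda x: A @ x
x_final = nesterov_constant_step(grad, np.ones(3), L=100.0, mu=1.0)
print(np.linalg.norm(x_final))  # small residual norm, near the minimizer at 0
```

The per-iteration cost matches the abstract's accounting: one gradient call plus a handful of Level 1 BLAS-style vector operations (axpy, scaling).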

This page was built for publication: Accelerating Nesterov's Method for Strongly Convex Functions with Lipschitz Gradient
