Smoothing technique for nonsmooth composite minimization with linear operator

arXiv: 1706.05837
MaRDI QID: Q6288024

Volkan Cevher, Olivier Fercoq, Quang Van Nguyen

Publication date: 19 June 2017

Abstract: We introduce and analyze an algorithm for the minimization of convex functions that are the sum of differentiable terms and proximable terms composed with linear operators. The method builds upon the recently developed smoothed gap technique. In addition to a precise convergence rate result, valid even in the presence of linear inclusion constraints, this new method allows an explicit treatment of the gradient of differentiable functions and can be enhanced with line-search. We also study the consequences of restarting the acceleration of the algorithm at a given frequency. These new features are not classical for primal-dual methods and allow us to solve difficult large-scale convex optimization problems. We numerically illustrate the superior performance of the algorithm against the state of the art on basis pursuit, TV-regularized least squares regression, and L1 regression problems.
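To illustrate the problem class the abstract describes (a differentiable term plus a nonsmooth, proximable term composed with a linear operator), here is a minimal Python sketch. It uses standard Moreau-envelope (Huber) smoothing of the L1 term and plain gradient descent on the smoothed objective; it is not the authors' smoothed-gap primal-dual method, and the toy data `A`, `b`, the helpers `grad_huber`, `grad_F_smooth`, and the parameters `lam`, `mu` are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): minimize
#     F(x) = 0.5 * ||x - b||^2  +  lam * ||A x||_1
# by replacing ||.||_1 with its Moreau envelope (Huber function, parameter mu)
# and running gradient descent on the resulting smooth objective F_mu.

rng = np.random.default_rng(0)
m, n = 40, 60
A = rng.standard_normal((m, n))      # linear operator inside the nonsmooth term
b = rng.standard_normal(n)
lam, mu = 0.1, 1e-2                  # regularization weight and smoothing parameter


def grad_huber(z, mu):
    """Gradient of the Moreau envelope of |.| with parameter mu (elementwise)."""
    return np.clip(z / mu, -1.0, 1.0)


def grad_F_smooth(x):
    """Gradient of the smoothed objective F_mu(x)."""
    return (x - b) + lam * A.T @ grad_huber(A @ x, mu)


# Step size 1/L, where L = 1 + lam * ||A||_2^2 / mu bounds the Lipschitz
# constant of grad F_mu.
L = 1.0 + lam * np.linalg.norm(A, 2) ** 2 / mu
x = np.zeros(n)
for _ in range(2000):
    x -= (1.0 / L) * grad_F_smooth(x)

print("objective (nonsmooth):",
      0.5 * np.sum((x - b) ** 2) + lam * np.sum(np.abs(A @ x)))
```

The paper's method goes beyond this baseline (acceleration, restarting, line-search, and explicit handling of linear inclusion constraints), but the sketch shows the core smoothing idea: the nonsmooth composite term becomes differentiable, at the cost of a Lipschitz constant that grows as the smoothing parameter shrinks.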