The Variable Metric Forward-Backward Splitting Algorithm Under Mild Differentiability Assumptions

From MaRDI portal
Publication: 5363380

DOI: 10.1137/16M1073741
zbMATH Open: 1375.65085
arXiv: 1605.00952
MaRDI QID: Q5363380


Author: Saverio Salzo


Publication date: 6 October 2017

Published in: SIAM Journal on Optimization

Abstract: We study the variable metric forward-backward splitting algorithm for convex minimization problems without the standard assumption of Lipschitz continuity of the gradient. In this setting, we prove that, by requiring only mild assumptions on the smooth part of the objective function and using several types of line search procedures for determining either the gradient descent stepsizes or the relaxation parameters, one still obtains weak convergence of the iterates and convergence in the objective function values. Moreover, an o(1/k) convergence rate in the function values is obtained if slightly stronger differentiability assumptions are added. We also illustrate several applications, including problems that involve Banach spaces and functions of divergence type.
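The forward-backward iteration with a line search on the stepsize, as described in the abstract, can be sketched as follows. This is a minimal illustration only, not the paper's exact procedure: the function name `fbs_linesearch`, the Armijo-type backtracking condition, and the soft-thresholding toy problem are assumptions chosen for concreteness, whereas the paper's line-search conditions and variable-metric operators are more general.

```python
import numpy as np

def fbs_linesearch(f, grad_f, prox_g, x0, gamma0=1.0, beta=0.5,
                   max_iter=500, tol=1e-10):
    """Forward-backward splitting with a backtracking stepsize search.

    A sketch: the stepsize gamma is shrunk until a quadratic
    upper-bound (sufficient-decrease) condition holds, so no global
    Lipschitz constant for grad_f is required.
    """
    x = x0.astype(float).copy()
    gamma = gamma0
    for _ in range(max_iter):
        g = grad_f(x)
        while True:
            # Forward (gradient) step followed by backward (proximal) step
            x_new = prox_g(x - gamma * g, gamma)
            d = x_new - x
            # Armijo-type condition on the smooth part f
            if f(x_new) <= f(x) + g @ d + (d @ d) / (2 * gamma) + 1e-12:
                break
            gamma *= beta  # shrink the stepsize and retry
        x = x_new
        if np.sqrt(d @ d) <= tol:  # (approximate) fixed point reached
            break
    return x

# Toy problem: min_x 0.5*||x - b||^2 + lam*||x||_1,
# whose solution is the soft-thresholding of b.
b = np.array([3.0, -0.5, 1.0])
lam = 1.0
prox_l1 = lambda v, gamma: np.sign(v) * np.maximum(np.abs(v) - gamma * lam, 0.0)
x_star = fbs_linesearch(lambda x: 0.5 * np.sum((x - b) ** 2),
                        lambda x: x - b, prox_l1, np.zeros(3))
```

Here the backtracking loop replaces prior knowledge of a Lipschitz constant: the quadratic upper-bound test is checked a posteriori at the candidate point, which is the kind of relaxation of the standard smoothness assumption that the paper makes rigorous.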


Full work available at URL: https://arxiv.org/abs/1605.00952










Cited In (43)





