Alternating minimization methods for strongly convex optimization


DOI: 10.1515/JIIP-2020-0074
zbMATH Open: 1478.90092
arXiv: 1911.08987
OpenAlex: W3155282024
MaRDI QID: Q2232092
FDO: Q2232092


Authors: Nazarii Tupitsa, Pavel Dvurechensky, Alexander V. Gasnikov, Sergey Guminov


Publication date: 4 October 2021

Published in: Journal of Inverse and Ill-posed Problems

Abstract: We consider alternating minimization procedures for convex optimization problems in which the variable is divided into many blocks, each block being amenable to minimization with respect to its variables while the other blocks are kept fixed. In the case of two blocks, we prove a linear convergence rate for the alternating minimization procedure under the Polyak-Łojasiewicz condition, which can be seen as a relaxation of the strong convexity assumption. Under the strong convexity assumption in the many-block setting, we provide an accelerated alternating minimization procedure with a linear rate depending on the square root of the condition number, as opposed to the condition number itself for the non-accelerated method. We also consider finding an approximate non-negative solution to a linear system of equations Ax = y via alternating minimization of the Kullback-Leibler (KL) divergence between Ax and y.
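
Below is a minimal sketch of the basic (non-accelerated) two-block scheme described in the abstract: exact alternating minimization over two blocks of a strongly convex quadratic, where each block subproblem reduces to a linear solve. The problem data, block sizes, and iteration count here are arbitrary choices for illustration and are not taken from the paper.

```python
import numpy as np

# Two-block alternating minimization on a strongly convex quadratic
#   f(x1, x2) = 1/2 z^T Q z - b^T z,   z = (x1, x2),
# with each block subproblem solved exactly (a small linear solve).
rng = np.random.default_rng(0)
n1, n2 = 5, 4
M = rng.standard_normal((n1 + n2, n1 + n2))
Q = M.T @ M + 0.5 * np.eye(n1 + n2)      # symmetric positive definite => strongly convex
b = rng.standard_normal(n1 + n2)

Q11, Q12 = Q[:n1, :n1], Q[:n1, n1:]
Q21, Q22 = Q[n1:, :n1], Q[n1:, n1:]
b1, b2 = b[:n1], b[n1:]

def f(x1, x2):
    z = np.concatenate([x1, x2])
    return 0.5 * z @ Q @ z - b @ z

x1, x2 = np.zeros(n1), np.zeros(n2)
for _ in range(50):
    # minimize over x1 with x2 frozen:  Q11 x1 = b1 - Q12 x2
    x1 = np.linalg.solve(Q11, b1 - Q12 @ x2)
    # minimize over x2 with x1 frozen:  Q22 x2 = b2 - Q21 x1
    x2 = np.linalg.solve(Q22, b2 - Q21 @ x1)

z_star = np.linalg.solve(Q, b)           # exact minimizer, for comparison
f_star = 0.5 * z_star @ Q @ z_star - b @ z_star
print("f(x) - f* after 50 sweeps:", f(x1, x2) - f_star)
```

Under strong convexity the suboptimality gap printed above decays linearly in the number of sweeps; the accelerated variant analyzed in the paper improves the dependence on the condition number to its square root.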


Full work available at URL: https://arxiv.org/abs/1911.08987






Cited In (13)





