Alternating minimization methods for strongly convex optimization

From MaRDI portal
Publication:2232092




Abstract: {We consider alternating minimization procedures for convex optimization problems whose variables are divided into many blocks, each block being amenable to minimization with respect to its own variables while the other blocks are frozen. In the case of two blocks, we prove a linear convergence rate for the alternating minimization procedure under the Polyak-Lojasiewicz condition, which can be seen as a relaxation of the strong convexity assumption. Under the strong convexity assumption in the many-blocks setting, we provide an accelerated alternating minimization procedure with a linear rate depending on the square root of the condition number, as opposed to the condition number itself for the non-accelerated method. We also consider approximating a non-negative solution to a linear system of equations Ax=y by alternating minimization of the Kullback-Leibler (KL) divergence between Ax and y.}










