Fast alternating linearization methods for minimizing the sum of two convex functions

From MaRDI portal
Publication:378095

DOI: 10.1007/S10107-012-0530-2
zbMATH Open: 1280.65051
arXiv: 0912.4571
OpenAlex: W2010286849
Wikidata: Q101200642 (Scholia: Q101200642)
MaRDI QID: Q378095
FDO: Q378095


Authors: Donald Goldfarb, Katya Scheinberg, Shiqian Ma


Publication date: 11 November 2013

Published in: Mathematical Programming. Series A. Series B

Abstract: In this paper we present first-order alternating linearization algorithms, based on an alternating direction augmented Lagrangian approach, for minimizing the sum of two convex functions. Our basic methods require at most O(1/ε) iterations to obtain an ε-optimal solution, while our accelerated (i.e., fast) versions require at most O(1/√ε) iterations, with little change in the computational effort required at each iteration. For both types of methods, we present one algorithm that requires both functions to be smooth with Lipschitz continuous gradients and one algorithm that needs only one of the functions to be so. The algorithms in this paper are Gauss-Seidel type methods, in contrast to the ones proposed by Goldfarb and Ma in [21], which are Jacobi type methods. Numerical results are reported to support our theoretical conclusions and demonstrate the practical potential of our algorithms.
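The Gauss-Seidel structure described in the abstract can be illustrated with a minimal sketch: each iteration alternately minimizes one function exactly while linearizing the other around the most recent iterate. The example below is not the paper's exact algorithm; it is an illustrative instance for f(x) = ½‖Ax − b‖² (smooth) and g(x) = λ‖x‖₁ (prox-friendly), with the step size μ, the subgradient choice, and all helper names being assumptions for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def alternating_linearization(A, b, lam, mu, iters=500):
    """Sketch of a Gauss-Seidel alternating linearization scheme for
    min_x f(x) + g(x), f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1.
    mu should satisfy mu <= 1/L, L the Lipschitz constant of grad f.
    """
    n = A.shape[1]
    x = np.zeros(n)
    y = np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    # The x-subproblem adds a proximal term (1/2mu)*||x - y||^2 to f,
    # so its optimality condition is the linear system H x = rhs below.
    H = AtA + np.eye(n) / mu
    for _ in range(iters):
        # x-step: minimize f exactly plus a linearization of g around y.
        gamma = lam * np.sign(y)               # a subgradient of g at y
        x = np.linalg.solve(H, Atb + y / mu - gamma)
        # y-step: minimize g exactly plus a linearization of f around x;
        # this is a standard proximal-gradient (shrinkage) step.
        grad_f = AtA @ x - Atb
        y = soft_threshold(x - mu * grad_f, mu * lam)
    return y
```

Note the Gauss-Seidel character: the y-step uses the freshly updated x, rather than the previous iterate as a Jacobi-type scheme would.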


Full work available at URL: https://arxiv.org/abs/0912.4571










Cited In (72)






