An inexact accelerated stochastic ADMM for separable convex optimization

From MaRDI portal
Publication:2114819

DOI: 10.1007/S10589-021-00338-8 · zbMATH Open: 1487.90521 · arXiv: 2010.12765 · OpenAlex: W3094257649 · MaRDI QID: Q2114819

Jianchao Bai, Hongchao Zhang, William Hager

Publication date: 15 March 2022

Published in: Computational Optimization and Applications

Abstract: An inexact accelerated stochastic Alternating Direction Method of Multipliers (AS-ADMM) scheme is developed for solving structured separable convex optimization problems with linear constraints. The objective function is the sum of a possibly nonsmooth convex function and a smooth function which is an average of many component convex functions. Problems having this structure often arise in machine learning and data mining applications. AS-ADMM combines the ideas of ADMM and stochastic gradient methods that use variance reduction techniques. One of the ADMM subproblems employs a linearization technique, while a similar linearization could also be introduced for the other subproblem. For a specified choice of the algorithm parameters, it is shown that the objective error and the constraint violation are $\mathcal{O}(1/k)$ relative to the number of outer iterations $k$. Under a strong convexity assumption, the expected iterate error converges to zero linearly. A linearized variant of AS-ADMM and incremental sampling strategies are also discussed. Numerical experiments with both stochastic and deterministic ADMM algorithms show that AS-ADMM can be particularly effective for structured optimization arising in big data applications.


Full work available at URL: https://arxiv.org/abs/2010.12765
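
The abstract describes the general recipe of combining ADMM with a variance-reduced stochastic gradient and a linearized subproblem. The sketch below illustrates that general idea on a lasso-type instance; it is not the paper's AS-ADMM (which uses specific accelerated parameter choices, inexactness criteria, and sampling strategies). The problem data, the step-size rule `eta = L + rho`, and the SVRG-style inner loop are assumptions made purely for illustration.

```python
import numpy as np

# Minimal illustrative sketch (not the paper's AS-ADMM): linearized stochastic
# ADMM with an SVRG-style variance-reduced gradient, applied to
#     min_x (1/(2N)) ||A x - y||^2 + lam ||z||_1   s.t.  x - z = 0,
# i.e. the smooth term is an average of N components and the nonsmooth term
# is handled in the z-subproblem. All parameter values below are assumed.

rng = np.random.default_rng(0)
N, d = 200, 50
A = rng.standard_normal((N, d))
x_true = np.zeros(d)
x_true[:5] = rng.standard_normal(5)
y = A @ x_true + 0.01 * rng.standard_normal(N)

lam, rho = 0.1, 1.0                       # l1 weight and ADMM penalty (assumed)
L = np.linalg.norm(A, 2) ** 2 / N         # smoothness estimate of the average f
eta = L + rho                             # linearization step-size parameter (assumed rule)

def full_grad(x):                         # exact gradient of f(x) = (1/(2N))||Ax - y||^2
    return A.T @ (A @ x - y) / N

def comp_grad(x, i):                      # gradient of one component f_i
    return A[i] * (A[i] @ x - y[i])

def soft_threshold(v, t):                 # prox of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(d); z = np.zeros(d); u = np.zeros(d)   # u is the dual multiplier

for epoch in range(30):                   # outer loop: refresh the SVRG snapshot
    x_snap = x.copy()
    mu = full_grad(x_snap)
    for _ in range(N):                    # inner stochastic passes
        i = rng.integers(N)
        # variance-reduced estimate of grad f(x)
        g_vr = comp_grad(x, i) - comp_grad(x_snap, i) + mu
        # linearized x-update: one gradient step on the augmented Lagrangian
        x = x - (g_vr + u + rho * (x - z)) / eta
        # exact z-update: proximal (soft-thresholding) step for lam * ||z||_1
        z = soft_threshold(x + u / rho, lam / rho)
        # dual ascent on the constraint x - z = 0
        u = u + rho * (x - z)

obj = 0.5 * np.mean((A @ x - y) ** 2) + lam * np.abs(z).sum()
print(f"objective = {obj:.4f},  ||x - z|| = {np.linalg.norm(x - z):.2e}")
```

The linearized x-update replaces an exact subproblem solve with a single gradient step on the augmented Lagrangian, using the cheap variance-reduced gradient in place of the full gradient; the z-update retains a closed-form proximal solution, which is the typical division of labor in stochastic ADMM schemes of this kind.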



