GMRES-accelerated ADMM for quadratic objectives

From MaRDI portal
Publication:4554067

DOI: 10.1137/16M1059941 · zbMATH Open: 1402.49025 · arXiv: 1601.06200 · OpenAlex: W3105492150 · MaRDI QID: Q4554067 · FDO: Q4554067


Authors: Richard Zhang, J. K. White


Publication date: 7 November 2018

Published in: SIAM Journal on Optimization

Abstract: We consider the sequence acceleration problem for the alternating direction method of multipliers (ADMM) applied to a class of equality-constrained problems with strongly convex quadratic objectives, which frequently arise as the Newton subproblem of interior-point methods. Within this context, the ADMM update equations are linear, the iterates are confined within a Krylov subspace, and the Generalized Minimal RESidual (GMRES) algorithm is optimal in its ability to accelerate convergence. The basic ADMM method solves a κ-conditioned problem in O(√κ) iterations. We give theoretical justification and numerical evidence that the GMRES-accelerated variant consistently solves the same problem in O(κ^{1/4}) iterations, an order-of-magnitude reduction, despite a worst-case bound of O(√κ) iterations. The method is shown to be competitive against standard preconditioned Krylov subspace methods for saddle-point problems. The method is embedded within SeDuMi, a popular open-source solver for conic optimization written in MATLAB, and used to solve many large-scale semidefinite programs with error that decreases like O(1/k²), instead of O(1/k), where k is the iteration index.
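The key observation in the abstract — that the ADMM updates for a quadratic objective are affine, so the fixed point can be found directly by a Krylov method — can be sketched as follows. This is an illustrative toy, not the paper's implementation: it runs scaled-form ADMM on min ½xᵀPx + qᵀx + ½zᵀRz subject to x = z, expresses one full sweep as the affine map v ↦ Mv + c on the stacked vector v = [z; u], and hands the fixed-point system (I − M)v = c to SciPy's GMRES. The matrices, the penalty ρ, and all names are invented for the example.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

# Hypothetical toy problem: min 0.5 x'Px + q'x + 0.5 z'Rz  s.t. x = z,
# with P, R symmetric positive definite (well conditioned here).
rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n))
P = A @ A.T + n * np.eye(n)
B = rng.standard_normal((n, n))
R = B @ B.T + n * np.eye(n)
q = rng.standard_normal(n)
rho = 1.0

# Cache the two linear solves used by the x- and z-updates.
Xsolve = np.linalg.inv(P + rho * np.eye(n))
Zsolve = np.linalg.inv(R + rho * np.eye(n))

def admm_sweep(v):
    """One scaled-form ADMM iteration on the stacked vector v = [z; u].

    Because the objective is quadratic, this map is affine in v:
    admm_sweep(v) = M v + c for some matrix M and offset c.
    """
    z, u = v[:n], v[n:]
    x = Xsolve @ (rho * (z - u) - q)   # x-update: (P + rho I) x = rho(z - u) - q
    z_new = Zsolve @ (rho * (x + u))   # z-update: (R + rho I) z = rho(x + u)
    u_new = u + x - z_new              # scaled dual update
    return np.concatenate([z_new, u_new])

# Affine offset c = admm_sweep(0); then M v = admm_sweep(v) - c is linear,
# and the ADMM fixed point v* = M v* + c solves (I - M) v* = c.
c = admm_sweep(np.zeros(2 * n))
op = LinearOperator((2 * n, 2 * n),
                    matvec=lambda v: v - (admm_sweep(v) - c))

v_star, info = gmres(op, c)            # GMRES jumps straight to the fixed point
x_star = v_star[:n]

# Sanity check against the KKT solution: at the fixed point x = z and
# (P + R) x = -q, i.e. x minimizes 0.5 x'(P+R)x + q'x.
x_direct = np.linalg.solve(P + R, -q)
print("max |x_gmres - x_direct| =", np.max(np.abs(x_star - x_direct)))
```

Each GMRES matrix-vector product costs exactly one ADMM sweep, which is the sense in which GMRES "accelerates" ADMM here: it finds the best iterate in the Krylov subspace that plain ADMM would otherwise only traverse geometrically.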


Full work available at URL: https://arxiv.org/abs/1601.06200




Cited In (5)


