s-step enlarged Krylov subspace conjugate gradient methods


DOI: 10.1137/18M1182528
zbMATH Open: 1429.65071
arXiv: 1804.10629
MaRDI QID: Q5208740
FDO: Q5208740


Author: Sophie Moufawad


Publication date: 10 January 2020

Published in: SIAM Journal on Scientific Computing

Abstract: Recently, enlarged Krylov subspace methods, which consist of enlarging the Krylov subspace by a maximum of t vectors per iteration based on the domain decomposition of the graph of A, were introduced with the aim of reducing communication when solving systems of linear equations $Ax = b$. In this paper, the s-step enlarged Krylov subspace Conjugate Gradient methods are introduced, whereby s iterations of the enlarged Conjugate Gradient methods are merged into one iteration. The numerical stability of these s-step methods is studied, and several numerically stable versions are proposed. Similarly to the enlarged Krylov subspace methods, the s-step enlarged Krylov subspace methods converge faster than Krylov methods in terms of iterations. Moreover, by computing st basis vectors of the enlarged Krylov subspace $\mathscr{K}_{k,t}(A, r_0)$ at the beginning of each s-step iteration, communication is further reduced. It is shown in this paper that the introduced methods are parallelizable and require less communication than their corresponding enlarged versions and than the Conjugate Gradient method.


Full work available at URL: https://arxiv.org/abs/1804.10629
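The enlarged s-step idea described in the abstract can be pictured with a small numerical sketch. The Python snippet below is only an illustration under simplifying assumptions, not the paper's numerically stable s-step enlarged CG algorithm: it splits the current residual into t vectors according to a (here, trivial contiguous-block) domain decomposition, builds st basis vectors of the s-step enlarged Krylov subspace at the start of each outer iteration, and performs a Galerkin projection onto that subspace. The names split_residual and s_step_enlarged_projection, and the choice of test matrix and domains, are hypothetical and introduced only for this example.

```python
import numpy as np


def split_residual(r, domains):
    """Split r into t vectors following a domain decomposition:
    column j keeps the entries of r belonging to domain j, zeros elsewhere.
    The columns sum back to r, so span(T(r)) contains r."""
    T = np.zeros((r.shape[0], len(domains)))
    for j, idx in enumerate(domains):
        T[idx, j] = r[idx]
    return T


def s_step_enlarged_projection(A, b, domains, s=2, n_outer=20, tol=1e-10):
    """Illustrative sketch (not the paper's algorithm): at each outer step,
    compute the st vectors [T(r), A T(r), ..., A^{s-1} T(r)], orthonormalize
    them, and apply a Galerkin projection of the SPD system onto their span."""
    n = A.shape[0]
    x = np.zeros(n)
    r = b - A @ x
    for _ in range(n_outer):
        # st candidate basis vectors of the s-step enlarged subspace
        blocks = [split_residual(r, domains)]
        for _ in range(s - 1):
            blocks.append(A @ blocks[-1])
        V = np.hstack(blocks)
        # QR keeps the basis well conditioned (a stand-in for the
        # numerically stable basis constructions studied in the paper)
        V, _ = np.linalg.qr(V)
        # Galerkin projection: solve the small system (V^T A V) y = V^T r
        y = np.linalg.solve(V.T @ A @ V, V.T @ r)
        x += V @ y
        r = b - A @ x
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
    return x, np.linalg.norm(r)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, t = 40, 4
    M = rng.standard_normal((n, n))
    A = M @ M.T + n * np.eye(n)                # SPD test matrix (assumed)
    b = rng.standard_normal(n)
    domains = np.array_split(np.arange(n), t)  # crude contiguous "domains"
    x, res = s_step_enlarged_projection(A, b, domains, s=2)
    print("residual norm:", res)
```

In this sketch, the communication-reduction argument of the paper shows up only structurally: all st basis vectors of an outer iteration are generated before the single small projected solve, instead of one Krylov direction (and its synchronizations) per CG iteration.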













