On the sublinear and superlinear rate of convergence of conjugate gradient methods

DOI: 10.1023/A:1016694031362
zbMath: 0972.65024
OpenAlex: W1628294536
MaRDI QID: Q5934347

Igor E. Kaporin, Owe Axelsson

Publication date: 19 June 2001

Published in: Numerical Algorithms

Full work available at URL: https://doi.org/10.1023/a:1016694031362
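
For context (a standard textbook estimate, not part of this record): for a symmetric positive definite system $Ax = b$ with spectral condition number $\kappa(A)$, the conjugate gradient method satisfies the classical linear-rate bound

\[
  \|x_k - x_*\|_A \;\le\; 2 \left( \frac{\sqrt{\kappa} - 1}{\sqrt{\kappa} + 1} \right)^{k} \|x_0 - x_*\|_A ,
\]

where $\|v\|_A = \sqrt{v^T A v}$. The sublinear and superlinear rates in the title refer to iteration phases in which the actual error reduction is slower, respectively faster, than this fixed linear factor.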


Related Items (17)

Mesh independent superlinear convergence estimates of the conjugate gradient method for some equivalent self-adjoint operators
Conditioning of linear systems arising from penalty methods
Stability analysis via condition number and effective condition number for the first kind boundary integral equations by advanced quadrature methods, a comparison
Effective condition number for weighted linear least squares problems and applications to the Trefftz method
Superlinear PCG Algorithms: Symmetric Part Preconditioning and Boundary Conditions
Goal-Oriented Optimal Approximations of Bayesian Linear Inverse Problems
Composite convergence bounds based on Chebyshev polynomials and finite precision conjugate gradient computations
Efficient fixed point and Newton-Krylov solvers for FFT-based homogenization of elasticity at large deformations
Accelerating the solution of linear systems appearing in two-phase reservoir simulation by the use of POD-based deflation methods
Milestones in the development of iterative solution methods
Effective condition number for numerical partial differential equations
Convergence analysis of Krylov subspace methods
Reaching the superlinear convergence phase of the CG method
Symmetric Part Preconditioning of the CG Method for Stokes Type Saddle-Point Systems
Iterative algorithm and estimation of solution for a fractional order differential equation
Error estimates for iterative algorithms for minimizing regularized quadratic subproblems
Equivalent operator preconditioning for elliptic problems