An adaptive s-step conjugate gradient algorithm with dynamic basis updating.
Publication: 778541
DOI: 10.21136/AM.2020.0136-19
MaRDI QID: Q778541
FDO: Q778541
Author: Erin Carson
Publication date: 2 July 2020
Published in: Applications of Mathematics
Full work available at URL: https://arxiv.org/abs/1908.04081
MSC classification:
- Computational methods for sparse matrices (65F50)
- Parallel numerical computation (65Y05)
- Complexity and performance of numerical algorithms (65Y20)
- Iterative numerical methods for linear systems (65F10)
Cites Work
- The University of Florida sparse matrix collection
- Methods of conjugate gradients for solving linear systems
- Behavior of slightly perturbed Lanczos and conjugate-gradient recurrences
- Title not available
- A parallel GMRES version for general sparse matrices
- An adaptive Chebyshev iterative method for nonsymmetric linear systems based on modified moments
- The Lanczos and conjugate gradient algorithms in finite precision arithmetic
- Theory of Inexact Krylov Subspace Methods and Applications to Scientific Computing
- s-step iterative methods for symmetric linear systems
- Reducing the effect of global communication in \(\text{GMRES}(m)\) and CG on parallel distributed memory computers
- Reliable updated residuals in hybrid Bi-CG methods
- Accuracy of two three-term and three two-term recurrences for Krylov space solvers
- Hiding Global Communication Latency in the GMRES Algorithm on Massively Parallel Machines
- A residual replacement strategy for improving the maximum attainable accuracy of \(s\)-step Krylov subspace methods
- Residual Replacement Strategies for Krylov Subspace Iterative Methods for the Convergence of True Residuals
- Accuracy of the \(s\)-Step Lanczos Method for the Symmetric Eigenproblem in Finite Precision
- The Condition of Polynomials in Power Form
- A Newton basis GMRES implementation
- Title not available
- Adaptive procedure for estimating parameters for the nonsymmetric Tchebychev iteration
- On the evaluation of polynomial coefficients
- Parallelizable restarted iterative methods for nonsymmetric linear systems. Part I: Theory
- On the generation of Krylov subspace bases
- Newton interpolation at Leja points
- Estimating the Attainable Accuracy of Recursively Computed Residual Methods
- Inexact Matrix-Vector Products in Krylov Methods for Solving Linear Systems: A Relaxation Strategy
- Varying the \(s\) in your \(s\)-step GMRES
- Approximating the extreme Ritz values and upper bounds for the \(A\)-norm of the error in CG
- The Adaptive \(s\)-Step Conjugate Gradient Method
Cited In (5)
- An Adaptive \(s\)-step Conjugate Gradient Algorithm with Dynamic Basis Updating
- Adaptively restarted block Krylov subspace methods with low-synchronization skeletons
- Developing variable \(s\)-step CGNE and CGNR algorithms for non-symmetric linear systems
- Finding solution of linear systems via new forms of BiCG, BiCGstab and CGS algorithms
- Block Gram-Schmidt algorithms and their stability properties