Global convergence of the restarted Lanczos and Jacobi-Davidson methods for symmetric eigenvalue problems
Publication: 500356
DOI: 10.1007/s00211-015-0699-4 · zbMath: 1325.65050 · OpenAlex: W2087959096 · MaRDI QID: Q500356
Publication date: 2 October 2015
Published in: Numerische Mathematik
Full work available at URL: https://doi.org/10.1007/s00211-015-0699-4
Keywords: global convergence; Jacobi-Davidson method; block Lanczos method; exact arithmetic; restarted Lanczos method
MSC classification: Numerical computation of eigenvalues and eigenvectors of matrices (65F15); Eigenvalues, singular values, and eigenvectors (15A18)
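For orientation only, below is a minimal Python sketch of a restarted Lanczos iteration of the kind named in the keywords: an m-step Lanczos process, restarted from the current Ritz vector, with full reorthogonalization to mimic the exact-arithmetic setting. This is an illustrative sketch, not the algorithm or analysis of the paper; the function name, parameter choices, and stopping rule are assumptions.

import numpy as np

def restarted_lanczos(A, v0, m=20, restarts=50, tol=1e-10):
    # Illustrative sketch: approximate the largest eigenpair of a real
    # symmetric matrix A by restarting an m-step Lanczos process from
    # the current Ritz vector. Not taken from the paper.
    n = A.shape[0]
    v = v0 / np.linalg.norm(v0)
    theta = v @ (A @ v)
    for _ in range(restarts):
        V = np.zeros((n, m))
        alpha = np.zeros(m)
        beta = np.zeros(m - 1)
        V[:, 0] = v
        w = A @ v
        alpha[0] = v @ w
        w = w - alpha[0] * v
        k = 1  # number of Lanczos vectors built so far
        for j in range(1, m):
            beta[j - 1] = np.linalg.norm(w)
            if beta[j - 1] < tol:  # invariant subspace found
                break
            V[:, j] = w / beta[j - 1]
            w = A @ V[:, j] - beta[j - 1] * V[:, j - 1]
            alpha[j] = V[:, j] @ w
            w = w - alpha[j] * V[:, j]
            # Full reorthogonalization keeps the basis orthonormal,
            # mimicking the exact-arithmetic setting of the analysis.
            w = w - V[:, :j + 1] @ (V[:, :j + 1].T @ w)
            k = j + 1
        # Rayleigh-Ritz extraction on the k-dimensional Krylov subspace:
        # T is the tridiagonal projection of A onto span(V[:, :k]).
        T = (np.diag(alpha[:k]) + np.diag(beta[:k - 1], 1)
             + np.diag(beta[:k - 1], -1))
        evals, evecs = np.linalg.eigh(T)
        theta, y = evals[-1], evecs[:, -1]  # largest Ritz pair
        v = V[:, :k] @ y                    # restart from the Ritz vector
        v = v / np.linalg.norm(v)
        if np.linalg.norm(A @ v - theta * v) < tol:
            break
    return theta, v

# Usage on a random symmetric test matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((200, 200))
A = (B + B.T) / 2
theta, v = restarted_lanczos(A, rng.standard_normal(200))
print(theta, np.linalg.norm(A @ v - theta * v))

Restarting from a single Ritz vector is the simplest policy; thick-restart and block variants, such as those in the works cited below, retain more of the subspace at each restart.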
Related Items (6)
- On global convergence of subspace projection methods for Hermitian eigenvalue problems
- Convergence proof of the harmonic Ritz pairs of iterative projection methods with restart strategies for symmetric eigenvalue problems
- On flexible block Chebyshev-Davidson method for solving symmetric generalized eigenvalue problems
- Deflation for the Off-Diagonal Block in Symmetric Saddle Point Systems
- On convergence of iterative projection methods for symmetric eigenvalue problems
- Efficient semidefinite programming with approximate ADMM
Cites Work
- On exact estimates of the convergence rate of the steepest ascent method in the symmetric eigenvalue problem
- The iterative calculation of a few of the lowest eigenvalues and corresponding eigenvectors of large real-symmetric matrices
- An implicit restarted Lanczos method for large symmetric eigenvalue problems
- Two-sided and alternating Jacobi-Davidson
- Iterative methods for the computation of a few eigenvalues of a large symmetric matrix
- An iterative method for finding characteristic vectors of a symmetric matrix
- Thick-Restart Lanczos Method for Large Symmetric Eigenvalue Problems
- Toward the Optimal Preconditioned Eigensolver: Locally Optimal Block Preconditioned Conjugate Gradient Method
- Adaptive Projection Subspace Dimension for the Thick-Restart Lanczos Method
- Nearly Optimal Preconditioned Methods for Hermitian Eigenproblems under Limited Memory. Part II: Seeking Many Eigenvalues
- Sharpness in rates of convergence for the symmetric Lanczos method
- Generalizations of Davidson’s Method for Computing Eigenvalues of Sparse Symmetric Matrices
- Implementation Aspects of Band Lanczos Algorithms for Computation of Eigenvalues of Large Sparse Symmetric Matrices
- Implicit Application of Polynomial Filters in a k-Step Arnoldi Method
- The simultaneous computation of a few of the algebraically largest and smallest eigenvalues of a large, sparse, symmetric matrix
- The Davidson Method
- IRBL: An Implicitly Restarted Block-Lanczos Method for Large-Scale Hermitian Eigenproblems
- Convergence Estimates for the Generalized Davidson Method for Symmetric Eigenvalue Problems I: The Preconditioning Aspect
- Convergence Estimates for the Generalized Davidson Method for Symmetric Eigenvalue Problems II: The Subspace Acceleration
- Templates for the Solution of Algebraic Eigenvalue Problems
- Convergence of Restarted Krylov Subspaces to Invariant Subspaces
- Is Jacobi–Davidson Faster than Davidson?
- A Jacobi–Davidson Iteration Method for Linear Eigenvalue Problems
- Convergence of Polynomial Restart Krylov Methods for Eigenvalue Computations
- Nearly Optimal Preconditioned Methods for Hermitian Eigenproblems under Limited Memory. Part I: Seeking One Eigenvalue