Global convergence of the restarted Lanczos and Jacobi-Davidson methods for symmetric eigenvalue problems
From MaRDI portal
Recommendations
- Restarting techniques for the (Jacobi-)Davidson symmetric eigenvalue method
- On global convergence of subspace projection methods for Hermitian eigenvalue problems
- Convergence estimates of nonrestarted and restarted block-Lanczos methods
- An implicit restarted Lanczos method for large symmetric eigenvalue problems
- scientific article; zbMATH DE number 6452806
Cites work
- scientific article; zbMATH DE number 3671573 (title unavailable)
- scientific article; zbMATH DE number 3633705 (title unavailable)
- scientific article; zbMATH DE number 1049347 (title unavailable)
- scientific article; zbMATH DE number 6159604 (title unavailable)
- A Jacobi–Davidson Iteration Method for Linear Eigenvalue Problems
- Adaptive projection subspace dimension for the thick-restart Lanczos method
- An implicit restarted Lanczos method for large symmetric eigenvalue problems
- An iterative method for finding characteristic vectors of a symmetric matrix
- Convergence Estimates for the Generalized Davidson Method for Symmetric Eigenvalue Problems I: The Preconditioning Aspect
- Convergence Estimates for the Generalized Davidson Method for Symmetric Eigenvalue Problems II: The Subspace Acceleration
- Convergence of Polynomial Restart Krylov Methods for Eigenvalue Computations
- Convergence of Restarted Krylov Subspaces to Invariant Subspaces
- Generalizations of Davidson’s Method for Computing Eigenvalues of Sparse Symmetric Matrices
- IRBL: An Implicitly Restarted Block-Lanczos Method for Large-Scale Hermitian Eigenproblems
- Implementation Aspects of Band Lanczos Algorithms for Computation of Eigenvalues of Large Sparse Symmetric Matrices
- Implicit Application of Polynomial Filters in a k-Step Arnoldi Method
- Is Jacobi--Davidson Faster than Davidson?
- Iterative methods for the computation of a few eigenvalues of a large symmetric matrix
- Nearly Optimal Preconditioned Methods for Hermitian Eigenproblems Under Limited Memory. Part II: Seeking Many Eigenvalues
- Nearly Optimal Preconditioned Methods for Hermitian Eigenproblems under Limited Memory. Part I: Seeking One Eigenvalue
- On exact estimates of the convergence rate of the steepest ascent method in the symmetric eigenvalue problem
- Sharpness in rates of convergence for the symmetric Lanczos method
- Templates for the Solution of Algebraic Eigenvalue Problems
- The Davidson Method
- The iterative calculation of a few of the lowest eigenvalues and corresponding eigenvectors of large real-symmetric matrices
- The simultaneous computation of a few of the algebraically largest and smallest eigenvalues of a large, sparse, symmetric matrix
- Thick-restart Lanczos method for large symmetric eigenvalue problems
- Toward the optimal preconditioned eigensolver: Locally optimal block preconditioned conjugate gradient method
- Two-sided and alternating Jacobi-Davidson
Cited in (7)
- On convergence of iterative projection methods for symmetric eigenvalue problems
- Deflation for the Off-Diagonal Block in Symmetric Saddle Point Systems
- On flexible block Chebyshev-Davidson method for solving symmetric generalized eigenvalue problems
- On global convergence of subspace projection methods for Hermitian eigenvalue problems
- Convergence proof of the harmonic Ritz pairs of iterative projection methods with restart strategies for symmetric eigenvalue problems
- Convergence estimates of nonrestarted and restarted block-Lanczos methods
- Efficient semidefinite programming with approximate ADMM
This page was built for publication: Global convergence of the restarted Lanczos and Jacobi-Davidson methods for symmetric eigenvalue problems