Limited memory block Krylov subspace optimization for computing dominant singular value decompositions
From MaRDI portal
Publication:2847729
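For context, the publication's subject — approximating the dominant singular triplets of a large matrix with block Krylov-type iterations — can be sketched as follows. This is a plain block subspace iteration with a Rayleigh-Ritz step, an illustrative relative of the method, not the paper's limited memory algorithm; the function name and parameters are hypothetical.

```python
import numpy as np

def dominant_svd(A, k, iters=30, seed=0):
    """Approximate the top-k singular triplets of A by block subspace
    iteration (illustrative sketch, not the paper's algorithm)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Start from a random block of k orthonormal columns.
    Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
    for _ in range(iters):
        # Alternate multiplications by A and A^T, re-orthonormalizing
        # each time to keep the block well conditioned.
        Y, _ = np.linalg.qr(A @ Q)
        Q, _ = np.linalg.qr(A.T @ Y)
    # Rayleigh-Ritz step: exact SVD of the small m-by-k matrix A @ Q.
    B = A @ Q
    U, s, Wt = np.linalg.svd(B, full_matrices=False)
    V = Q @ Wt.T
    return U, s, V

# Usage: a test matrix with known singular values 10, 5, 2, 1.
rng = np.random.default_rng(1)
U0, _ = np.linalg.qr(rng.standard_normal((50, 4)))
V0, _ = np.linalg.qr(rng.standard_normal((30, 4)))
A = U0 @ np.diag([10.0, 5.0, 2.0, 1.0]) @ V0.T
U, s, V = dominant_svd(A, k=2)
# s should be close to [10, 5], the two dominant singular values.
```

The loop applies powers of \(A^T A\) to the block, so the approximation error for the k-th singular value decays geometrically with the gap ratio to the (k+1)-th; the limited memory block Krylov approach of the publication accelerates this by optimizing over a richer subspace while keeping storage bounded.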
Recommendations
- Low-rank incremental methods for computing dominant singular subspaces
- The singular value decomposition: anatomy of optimizing an algorithm for extreme scale
- Lanczos, Householder transformations, and implicit deflation for fast and reliable dominant singular subspace computation
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions
- Split-and-combine singular value decomposition for large-scale matrix
Cited in (28)
- Finding low-rank solutions via nonconvex matrix factorization, efficiently and provably
- Research on the advances of the singular value decomposition and its application in high-dimensional data mining
- Low-rank incremental methods for computing dominant singular subspaces
- Accelerating convergence by augmented Rayleigh-Ritz projections for large-scale eigenpair computation
- Decomposition into low-rank plus additive matrices for background/foreground separation: a review for a comparative evaluation with a large-scale dataset
- Accelerating large partial EVD/SVD calculations by filtered block Davidson methods
- A brief introduction to manifold optimization
- A New First-Order Algorithmic Framework for Optimization Problems with Orthogonality Constraints
- Hierarchical optimization for neutron scattering problems
- Estimating a few extreme singular values and vectors for large-scale matrices in tensor train format
- Trace minimization method via penalty for linear response eigenvalue problems
- TRPL+K: Thick-Restart Preconditioned Lanczos+K Method for Large Symmetric Eigenvalue Problems
- A tensor train approach for internet traffic data completion
- A refinement of approximate invariant subspaces of matrices based on SVD in high dimensionality reduction and image compression
- Principal components: a descent algorithm
- A stochastic variance reduction method for PCA by an exact penalty approach
- The singular value decomposition: anatomy of optimizing an algorithm for extreme scale
- An efficient Gauss-Newton algorithm for symmetric low-rank product matrix approximations
- Limited memory restarted \(\ell^p\)-\(\ell^q\) minimization methods using generalized Krylov subspaces
- Stochastic Gauss-Newton algorithms for online PCA
- Seeking consensus on subspaces in federated principal component analysis
- Efficient proximal mapping computation for low-rank inducing norms
- Subspace methods with local refinements for eigenvalue computation using low-rank tensor-train format
- Preconditioners for nonsymmetric indefinite linear systems
- Slow and finite-time relaxations to \(m\)-bipartite consensus on the Stiefel manifold
- A Riemannian conjugate gradient method for optimization on the Stiefel manifold
- Structured Quasi-Newton Methods for Optimization with Orthogonality Constraints
- Background subtraction using adaptive singular value decomposition