Limited memory block Krylov subspace optimization for computing dominant singular value decompositions
DOI: 10.1137/120871328
zbMATH Open: 1278.65045
OpenAlex: W1968157422
MaRDI QID: Q2847729
FDO: Q2847729
Authors: Xin Liu, Zaiwen Wen, Yin Zhang
Publication date: 11 September 2013
Published in: SIAM Journal on Scientific Computing
Full work available at URL: https://doi.org/10.1137/120871328
Recommendations
- Low-rank incremental methods for computing dominant singular subspaces
- The singular value decomposition: anatomy of optimizing an algorithm for extreme scale
- Lanczos, Householder transformations, and implicit deflation for fast and reliable dominant singular subspace computation
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions
- Split-and-combine singular value decomposition for large-scale matrix
Keywords: data mining; dimension reduction; principal component analysis; convergence; eigenvalue decomposition; numerical examples; algorithm; iteration method; Krylov subspace method; subspace optimization; dominant singular value decomposition; large and dense matrices
Classification (MSC): Numerical mathematical programming methods (65K05); Numerical computation of eigenvalues and eigenvectors of matrices (65F15); Numerical solutions to overdetermined systems, pseudoinverses (65F20); Nonconvex programming, global optimization (90C26)
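The entry itself does not reproduce the algorithm; as a rough illustration of the keywords above (block Krylov subspace, dominant singular value decomposition), the following is a minimal plain block-Krylov plus Rayleigh-Ritz sketch in NumPy. It is a generic textbook-style construction, not the limited-memory subspace-optimization method of the paper; the function name, block size, and iteration count are illustrative assumptions.

```python
import numpy as np

def block_krylov_dominant_svd(A, k, block=None, iters=8, seed=0):
    """Approximate the k dominant singular triplets of A from a block
    Krylov subspace generated by A^T A acting on a random block.
    Generic illustration only, not the paper's limited-memory algorithm."""
    m, n = A.shape
    p = block if block is not None else min(2 * k, n)    # oversampled block size
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n, p))                       # random starting block
    blocks = []
    for _ in range(iters):
        Y = A @ X                                         # Y = A X
        X = A.T @ Y                                       # X = A^T A X (next Krylov block)
        X, _ = np.linalg.qr(X)                            # re-orthonormalize for stability
        blocks.append(X)
    Q, _ = np.linalg.qr(np.hstack(blocks))                # orthonormal basis of the Krylov subspace
    B = A @ Q                                             # restrict A to the subspace
    U, s, Wt = np.linalg.svd(B, full_matrices=False)      # small dense SVD
    V = Q @ Wt.T                                          # lift right singular vectors back
    return U[:, :k], s[:k], V[:, :k]

# quick comparison against a full dense SVD on a small random matrix
A = np.random.default_rng(1).standard_normal((500, 300))
U, s, V = block_krylov_dominant_svd(A, k=5)
print("approx:", np.round(s, 3))
print("exact :", np.round(np.linalg.svd(A, compute_uv=False)[:5], 3))
```

The sketch keeps every generated block, whereas a limited-memory scheme of the kind studied in the paper would retain only a few recent blocks and solve a subspace optimization problem over them.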
Cited In (27)
- A New First-Order Algorithmic Framework for Optimization Problems with Orthogonality Constraints
- Background subtraction using adaptive singular value decomposition
- Preconditioners for nonsymmetric indefinite linear systems
- Low-rank incremental methods for computing dominant singular subspaces
- An efficient Gauss-Newton algorithm for symmetric low-rank product matrix approximations
- Limited memory restarted \(\ell^p\)-\(\ell^q\) minimization methods using generalized Krylov subspaces
- Finding low-rank solutions via nonconvex matrix factorization, efficiently and provably
- Research on the advances of the singular value decomposition and its application in high-dimensional data mining
- TRPL+K: Thick-Restart Preconditioned Lanczos+K Method for Large Symmetric Eigenvalue Problems
- Slow and finite-time relaxations to \(m\)-bipartite consensus on the Stiefel manifold
- A tensor train approach for internet traffic data completion
- A brief introduction to manifold optimization
- Hierarchical optimization for neutron scattering problems
- Structured Quasi-Newton Methods for Optimization with Orthogonality Constraints
- Seeking consensus on subspaces in federated principal component analysis
- Accelerating large partial EVD/SVD calculations by filtered block Davidson methods
- Stochastic Gauss-Newton algorithms for online PCA
- Principal components: a descent algorithm
- A Riemannian conjugate gradient method for optimization on the Stiefel manifold
- Efficient proximal mapping computation for low-rank inducing norms
- Trace minimization method via penalty for linear response eigenvalue problems
- A stochastic variance reduction method for PCA by an exact penalty approach
- Subspace methods with local refinements for eigenvalue computation using low-rank tensor-train format
- The singular value decomposition: anatomy of optimizing an algorithm for extreme scale
- Accelerating convergence by augmented Rayleigh-Ritz projections for large-scale eigenpair computation
- Decomposition into low-rank plus additive matrices for background/foreground separation: a review for a comparative evaluation with a large-scale dataset
- Estimating a few extreme singular values and vectors for large-scale matrices in tensor train format