Constructing low-rank Tucker tensor approximations using generalized completion
From MaRDI portal
Publication: 6489425
DOI: 10.1515/RNAM-2024-0010
MaRDI QID: Q6489425
Publication date: 22 April 2024
Published in: Russian Journal of Numerical Analysis and Mathematical Modelling
Classification (MSC):
- Factorization of matrices (15A23)
- Complexity and performance of numerical algorithms (65Y20)
- Numerical methods for low-rank matrix approximation; matrix compression (65F55)
Cites Work
- Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions
- Matrix completion by singular value thresholding: sharp bounds
- Convex multi-task feature learning
- Database-friendly random projections: Johnson-Lindenstrauss with binary coins
- Low-rank approximation algorithms for matrix completion with random sampling
- Matrix completion with sparse measurement errors
- Guarantees of Riemannian Optimization for Low Rank Matrix Recovery
- Low-Rank Matrix Completion by Riemannian Optimization
- Compressive Multiplexing of Correlated Signals
- Geometric Methods on Low-Rank Matrix and Tensor Manifolds
- A Multilinear Singular Value Decomposition
- An elementary proof of a theorem of Johnson and Lindenstrauss
- The Fast Johnson–Lindenstrauss Transform and Approximate Nearest Neighbors
- A Simpler Approach to Matrix Completion
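The cited multilinear singular value decomposition (HOSVD) is the standard route to the low-rank Tucker approximations the publication's title refers to. As a minimal, hedged sketch (not the paper's generalized-completion algorithm, which works from incomplete or sampled entries), a truncated HOSVD in numpy might look like:

```python
import numpy as np

def hosvd(T, ranks):
    """Truncated higher-order SVD: Tucker approximation of tensor T
    with multilinear ranks given by `ranks` (one rank per mode)."""
    factors = []
    for mode, r in enumerate(ranks):
        # Unfold T along `mode` into a matrix and take its leading
        # left singular vectors as the mode-`mode` factor.
        unfolding = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :r])
    # Core tensor: contract each mode of T with the matching factor.
    core = T
    for mode, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(core, U.conj(), axes=(mode, 0)), -1, mode)
    return core, factors

def tucker_to_full(core, factors):
    """Expand a Tucker representation (core, factors) back to a full tensor."""
    T = core
    for mode, U in enumerate(factors):
        T = np.moveaxis(np.tensordot(T, U, axes=(mode, 1)), -1, mode)
    return T
```

If a tensor has exact multilinear rank at most `ranks`, the truncated HOSVD recovers it exactly; for general tensors it yields a quasi-optimal low-rank Tucker approximation.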