Constructing low-rank Tucker tensor approximations using generalized completion
From MaRDI portal
Publication:6489425
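The publication concerns low-rank Tucker approximations built via a generalized completion approach. As background context only (the record lists no algorithm details), here is a minimal sketch of the classical truncated HOSVD, the multilinear SVD cited below, which produces a low-multilinear-rank Tucker approximation of a fully observed tensor; the function names `unfold`, `hosvd`, and `reconstruct` are illustrative, not from the paper.

```python
# Hedged sketch: truncated HOSVD giving a Tucker approximation
# core x_0 U[0] x_1 U[1] ... of a dense tensor. This is the classical
# baseline, NOT the paper's generalized-completion method.
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: bring `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_multiply(T, M, mode):
    """Mode-n product T x_n M (M has shape (new_dim, T.shape[mode]))."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1),
                       0, mode)

def hosvd(T, ranks):
    """Truncated HOSVD: factors from leading left singular vectors of
    each unfolding; core obtained by projecting T onto the factors."""
    U = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
         for m, r in enumerate(ranks)]
    core = T
    for mode, u in enumerate(U):
        core = mode_multiply(core, u.T, mode)  # project mode by mode
    return core, U

def reconstruct(core, U):
    """Expand a Tucker representation back into a full tensor."""
    T = core
    for mode, u in enumerate(U):
        T = mode_multiply(T, u, mode)
    return T
```

If the input tensor has exact multilinear rank at most `ranks`, the truncated HOSVD recovers it exactly (up to floating-point error); otherwise it yields a quasi-optimal low-rank approximation.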
Recommendations
- Tucker factorization with missing data with application to low-n-rank tensor completion
- Low-rank tensor completion by Riemannian optimization
- Cross: efficient low-rank tensor completion
- Recovering low CP/Tucker ranked tensors, with applications in tensor completion
- On polynomial time methods for exact low-rank tensor completion
Cites work
- A Multilinear Singular Value Decomposition
- A simpler approach to matrix completion
- An elementary proof of a theorem of Johnson and Lindenstrauss
- Compressive Multiplexing of Correlated Signals
- Convex multi-task feature learning
- Database-friendly random projections: Johnson-Lindenstrauss with binary coins
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions
- Geometric Methods on Low-Rank Matrix and Tensor Manifolds
- Guarantees of Riemannian optimization for low rank matrix recovery
- Low-rank approximation algorithms for matrix completion with random sampling
- Low-rank matrix completion by Riemannian optimization
- Matrix completion by singular value thresholding: sharp bounds
- Matrix completion with sparse measurement errors
- The fast Johnson-Lindenstrauss transform and approximate nearest neighbors
Cited in (1)