Constructing low-rank Tucker tensor approximations using generalized completion
Publication: 6489425
DOI: 10.1515/RNAM-2024-0010
MaRDI QID: Q6489425
Author: S. V. Petrov
Publication date: 22 April 2024
Published in: Russian Journal of Numerical Analysis and Mathematical Modelling
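The publication concerns low-rank Tucker approximations. As background only (this is not the paper's generalized-completion algorithm), a standard way to build a rank-(r1, r2, r3) Tucker approximation of a fully observed tensor is the truncated higher-order SVD (HOSVD) of De Lathauwer et al., cited below as "A Multilinear Singular Value Decomposition". A minimal NumPy sketch:

```python
# Illustrative sketch: truncated HOSVD for a 3-way Tucker approximation.
# Generic textbook method, not the completion algorithm of the cited paper.
import numpy as np

def tucker_hosvd(T, ranks):
    """Return core G and factors U with T ~= G x_1 U[0] x_2 U[1] x_3 U[2]."""
    U = []
    for mode, r in enumerate(ranks):
        # mode-n unfolding: bring `mode` to the front, flatten the rest
        unfolding = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        u, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        U.append(u[:, :r])  # leading r left singular vectors
    # core tensor: contract T with each factor transpose along its mode
    G = T
    for mode, u in enumerate(U):
        G = np.moveaxis(np.tensordot(u.T, np.moveaxis(G, mode, 0), axes=1), 0, mode)
    return G, U

rng = np.random.default_rng(0)
# test tensor with exact multilinear rank (2, 2, 2)
G0 = rng.standard_normal((2, 2, 2))
A, B, C = (rng.standard_normal((6, 2)) for _ in range(3))
T = np.einsum('abc,ia,jb,kc->ijk', G0, A, B, C)

G, U = tucker_hosvd(T, (2, 2, 2))
T_hat = np.einsum('abc,ia,jb,kc->ijk', G, U[0], U[1], U[2])
print(np.allclose(T, T_hat))  # exact recovery for an exactly rank-(2,2,2) tensor
```

Completion-type methods such as the one in this publication instead fit the factors and core from a subset (or generalized measurements) of the tensor entries, rather than from full unfoldings as above.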
Recommendations
- Tucker factorization with missing data with application to low-n-rank tensor completion
- Low-rank tensor completion by Riemannian optimization
- Cross: efficient low-rank tensor completion
- Recovering low CP/Tucker ranked tensors, with applications in tensor completion
- On polynomial time methods for exact low-rank tensor completion
Classification (MSC):
- Complexity and performance of numerical algorithms (65Y20)
- Factorization of matrices (15A23)
- Numerical methods for low-rank matrix approximation; matrix compression (65F55)
Cites Work
- An elementary proof of a theorem of Johnson and Lindenstrauss
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions
- Matrix completion by singular value thresholding: sharp bounds
- A Multilinear Singular Value Decomposition
- Low-rank matrix completion by Riemannian optimization
- A simpler approach to matrix completion
- Convex multi-task feature learning
- Database-friendly random projections: Johnson-Lindenstrauss with binary coins
- The fast Johnson-Lindenstrauss transform and approximate nearest neighbors
- Guarantees of Riemannian optimization for low rank matrix recovery
- Compressive Multiplexing of Correlated Signals
- Geometric Methods on Low-Rank Matrix and Tensor Manifolds
- Low-rank approximation algorithms for matrix completion with random sampling
- Matrix completion with sparse measurement errors
Cited In (1)