Online subspace learning and imputation by tensor-ring decomposition
Publication: 6488720
DOI: 10.1016/J.NEUNET.2022.05.023 · Wikidata: Q114145547 · Scholia: Q114145547 · MaRDI QID: Q6488720
Authors: Jinshi Yu, Tao Zou, Guoxu Zhou
Publication date: 17 October 2023
Published in: Neural Networks
Cites Work
- Tensor Decompositions and Applications
- Canonical polyadic decomposition of third-order tensors: relaxed uniqueness conditions and algebraic algorithm
- Low-rank tensor completion by Riemannian optimization
- An efficient matrix bi-factorization alternative optimization method for low-rank matrix recovery and completion
- Tensor \(N\)-tubal rank and its convex relaxation for low-rank tensor recovery
- Multiple graphs learning with a new weighted tensor nuclear norm
- Low-rank tensor constrained co-regularized multi-view spectral clustering
- Hybrid tensor decomposition in neural network compression
- Manifold regularized matrix completion for multi-label learning with ADMM
- Tensor completion and low-n-rank tensor recovery via convex optimization
- New Uniqueness Conditions for the Canonical Polyadic Decomposition of Third-Order Tensors
- Fast Nonnegative Matrix/Tensor Factorization Based on Low-Rank Approximation
- Subspace Learning and Imputation for Streaming Big Data Matrices and Tensors
- Efficient Tensor Completion for Color Image and Video Recovery: Low-Rank Tensor Train
- An Iterative Reweighted Method for Tucker Decomposition of Incomplete Tensors
- Smooth PARAFAC Decomposition for Tensor Completion
- Double Coupled Canonical Polyadic Decomposition for Joint Blind Source Separation