Block tensor train decomposition for missing data estimation
Recommendations
- Low-rank tensor completion using matrix factorization based on tensor train rank and total variation
- Tucker factorization with missing data with application to low-\(n\)-rank tensor completion
- Tensor train rank minimization with nonlocal self-similarity for tensor completion
- Tensor factorization with total variation for internet traffic data imputation
- Majorized proximal alternating imputation for regularized rank constrained matrix completion
Cites work
- scientific article; zbMATH DE number 2107836 (no title available)
- A literature survey of low-rank tensor approximation techniques
- A new scheme for the tensor representation
- Alternating minimal energy methods for linear systems in higher dimensions
- Analysis of individual differences in multidimensional scaling via an \(n\)-way generalization of ``Eckart-Young'' decomposition
- Breaking the Curse of Dimensionality, Or How to Use SVD in Many Dimensions
- Computation of extreme eigenvalues in higher dimensions using block tensor train format
- Computing low-rank approximations of large-scale matrices with the tensor network randomized SVD
- Efficient Tensor Completion for Color Image and Video Recovery: Low-Rank Tensor Train
- Estimating a few extreme singular values and vectors for large-scale matrices in tensor train format
- Exact matrix completion via convex optimization
- Fast solution of parabolic problems in the tensor train/quantized tensor train format with initial application to the Fokker-Planck equation
- Flexible imputation of missing data
- Fundamental tensor operations for large-scale data analysis using tensor network formats
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- Hierarchical Singular Value Decomposition of Tensors
- Low-rank matrix approximation with weights or missing data is NP-hard
- Low-rank tensor completion by Riemannian optimization
- Low-rank tensor methods with subspace correction for symmetric eigenvalue problems
- Lower Rank Approximation of Matrices by Least Squares with Any Choice of Weights
- Nuclear norm of higher-order tensors
- Numerical methods for large eigenvalue problems
- On manifolds of tensors of fixed TT-rank
- On minimal subspaces in tensor representations
- On tensor completion via nuclear norm minimization
- Principal component analysis
- Recommender Systems Handbook
- Regularized computation of approximate pseudoinverse of large matrices using low-rank tensor train decompositions
- Riemannian optimization for high-dimensional tensor completion
- Smooth PARAFAC Decomposition for Tensor Completion
- Solution of Linear Systems and Matrix Inversion in the TT-Format
- Spectral regularization algorithms for learning large incomplete matrices
- Tensor Decompositions and Applications
- Tensor completion and low-\(n\)-rank tensor recovery via convex optimization
- Tensor completion in hierarchical tensor representations
- Tensor conjugate-gradient-type method for Rayleigh quotient minimization in block QTT-format
- Tensor spaces and numerical tensor calculus
- Tensor-train decomposition
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- The alternating linear scheme for tensor optimization in the tensor train format
- Tucker factorization with missing data with application to low-\(n\)-rank tensor completion
- Variants of alternating least squares tensor completion in the tensor train format
- Weighted least squares fitting using ordinary least squares algorithms
- \(O(d \log N)\)-quantics approximation of \(N\)-\(d\) tensors in high-dimensional numerical modeling