On polynomial time methods for exact low-rank tensor completion
From MaRDI portal
Abstract: In this paper, we investigate the sample size requirement for exact recovery of a high-order tensor of low rank from a subset of its entries. We show that a gradient descent algorithm with initial value obtained from a spectral method can, in particular, reconstruct a \(d\times d\times d\) tensor of multilinear ranks \((r,r,r)\) with high probability from as few as \(O(r^{7/2}d^{3/2}\log^{7/2}d + r^7d\log^6 d)\) entries. In the case when the ranks \(r=O(1)\), our sample size requirement matches that for nuclear norm minimization (Yuan and Zhang, 2016a), or alternating least squares assuming orthogonal decomposability (Jain and Oh, 2014). Unlike these earlier approaches, however, our method is efficient to compute, easy to implement, and does not impose extra structure on the tensor. Numerical results are presented to further demonstrate the merits of the proposed approach.
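The abstract describes recovering a tensor of low multilinear rank from a subset of its entries via gradient descent. As an illustrative sketch only: the toy NumPy snippet below fits a CP-style factorization to the observed entries of a synthetic low-rank tensor by plain gradient descent on a least-squares objective. It uses random initialization, a generous observation fraction, and a generic factorization, not the spectral initialization or the specific updates analyzed in the paper; all names and parameter choices here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic d x d x d tensor of multilinear rank (r, r, r):
# a random Tucker core contracted with random factor matrices.
d, r = 12, 2
core = rng.standard_normal((r, r, r))
U, V, W = (rng.standard_normal((d, r)) for _ in range(3))
T = np.einsum('abc,ia,jb,kc->ijk', core, U, V, W)

# Observe a random half of the entries (far more than the paper's
# near-minimal regime; chosen so this toy fit converges quickly).
mask = rng.random((d, d, d)) < 0.5

# Fit CP factors A, B, C (rank R chosen generously) by gradient
# descent on the squared error over observed entries only.
R, lr = 8, 0.005
A, B, C = (0.5 * rng.standard_normal((d, R)) for _ in range(3))

def masked_loss(A, B, C):
    X = np.einsum('ir,jr,kr->ijk', A, B, C)
    return 0.5 * np.sum((mask * (X - T)) ** 2)

init_loss = masked_loss(A, B, C)
for _ in range(4000):
    X = np.einsum('ir,jr,kr->ijk', A, B, C)
    E = mask * (X - T)  # residual, zeroed on unobserved entries
    gA = np.einsum('ijk,jr,kr->ir', E, B, C)
    gB = np.einsum('ijk,ir,kr->jr', E, A, C)
    gC = np.einsum('ijk,ir,jr->kr', E, A, B)
    A, B, C = A - lr * gA, B - lr * gB, C - lr * gC
final_loss = masked_loss(A, B, C)
```

The point of the sketch is only the structure of the objective: the loss (and hence each gradient) touches observed entries alone, so the factors are determined entirely by the sampled subset, which is what makes the sample size question studied in the paper meaningful.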
Recommendations
Cites work
- scientific article (zbMATH DE number 1254560; no title available)
- scientific article (zbMATH DE number 5223994; no title available)
- A Bennett concentration inequality and its application to suprema of empirical processes
- A Newton-Grassmann method for computing the best multilinear rank-\((r_1,r_2,r_3)\) approximation of a tensor
- A simpler approach to matrix completion
- A useful variant of the Davis-Kahan theorem for statisticians
- Decoupling inequalities for the tail probabilities of multivariate \(U\)-statistics
- Exact matrix completion via convex optimization
- Incoherent Tensor Norms and Their Applications in Higher Order Tensor Completion
- Linear and nonlinear programming
- Low rank tensor recovery via iterative hard thresholding
- Low-rank tensor completion by Riemannian optimization
- Matrix Completion From a Few Entries
- Most tensor problems are NP-hard
- Noisy tensor completion via the sum-of-squares hierarchy
- On tensor completion via nuclear norm minimization
- Quasi-Newton methods on Grassmannians and multilinear approximations of tensors
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- Spectral algorithms for tensor completion
- Tensor Algebra and Multidimensional Harmonic Retrieval in Signal Processing for MIMO Radar
- Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem
- Tensor completion and low-\(n\)-rank tensor recovery via convex optimization
- Tensor decompositions for learning latent variable models
- Tensor theta norms and low rank recovery
- Tensor-Based Formulation and Nuclear Norm Regularization for Multienergy Computed Tomography
- The Geometry of Algorithms with Orthogonality Constraints
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- User-friendly tail bounds for sums of random matrices
Cited in (28)
- Deterministic tensor completion with hypergraph expanders
- Covariate-Assisted Sparse Tensor Completion
- ISLET: fast and optimal low-rank tensor regression via importance sketching
- Tensor completion by multi-rank via unitary transformation
- The Sup-norm Perturbation of HOSVD and Low Rank Tensor Denoising
- Near-optimal sample complexity for convex tensor completion
- Noisy tensor completion via the sum-of-squares hierarchy
- Spectral algorithms for tensor completion
- Recovering orthogonal tensors under arbitrarily strong, but locally correlated, noise
- An optimal statistical and computational framework for generalized tensor estimation
- Inference for low-rank tensors -- no need to debias
- Low-rank approximation and completion of positive tensors
- Generalized Low-Rank Plus Sparse Tensor Estimation by Fast Riemannian Optimization
- Statistically optimal and computationally efficient low rank tensor completion from noisy entries
- Variants of alternating least squares tensor completion in the tensor train format
- Latent Space Model for Higher-Order Networks and Generalized Tensor Decomposition
- Subspace estimation from unbalanced and incomplete data matrices: \({\ell_{2,\infty}}\) statistical guarantees
- Deterministic and Probabilistic Conditions for Finite Completability of Low-Tucker-Rank Tensor
- An approximation method of CP rank for third-order tensor completion
- On tensor completion via nuclear norm minimization
- Robust Low-Rank Tensor Decomposition with the L 2 Criterion
- Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors
- Constructing low-rank Tucker tensor approximations using generalized completion
- Factor Models for High-Dimensional Tensor Time Series
- Community detection on mixture multilayer networks via regularized tensor decomposition
- Cross: efficient low-rank tensor completion
- Statistical inference for structured high-dimensional models. Abstracts from the workshop held March 11--17, 2018
- Riemannian conjugate gradient descent method for fixed multi rank third-order tensor completion
MaRDI item: Q2007852