Low-rank approximation and completion of positive tensors
From MaRDI portal
Publication:2827064
Abstract: Unlike the matrix case, computing low-rank approximations of tensors is NP-hard and numerically ill-posed in general. Even computing the best rank-1 approximation of a tensor is NP-hard. In this paper, we use convex optimization to develop polynomial-time algorithms for low-rank approximation and completion of positive tensors. Our approach is to use algebraic topology to define a new (numerically well-posed) decomposition for positive tensors, which we show is equivalent to the standard tensor decomposition in important cases. Though computing this decomposition is a nonconvex optimization problem, we prove it can be exactly reformulated as a convex optimization problem. This allows us to construct polynomial-time randomized algorithms for computing this decomposition and for solving low-rank tensor approximation problems. Among the consequences is that best rank-1 approximations of positive tensors can be computed in polynomial time. Our framework is next extended to the tensor completion problem, where noisy entries of a tensor are observed and then used to estimate missing entries. We provide a polynomial-time algorithm that for specific cases requires a polynomial (in tensor order) number of measurements, in contrast to existing approaches that require an exponential number of measurements. These algorithms are extended to exploit sparsity in the tensor to reduce the number of measurements needed. We conclude by providing a novel interpretation of statistical regression problems with categorical variables as tensor completion problems, and numerical examples with synthetic data and data from a bioengineered metabolic network show the improved performance of our approach on this problem.
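To make the rank-1 approximation problem discussed in the abstract concrete, the following is a minimal illustrative sketch (not the paper's convex-optimization algorithm) of the classical higher-order power method (HOPM) for approximating a 3-way tensor by a rank-1 tensor \(\lambda\, a \otimes b \otimes c\); HOPM is the method analyzed in the cited work "A new convergence proof for the higher-order power method and generalizations". The function name `hopm_rank1` and its parameters are assumptions for this sketch.

```python
import numpy as np

def hopm_rank1(T, iters=100, seed=0):
    """Heuristic best rank-1 approximation lambda * a (x) b (x) c of a
    3-way tensor T via alternating higher-order power iterations.
    This is the classical HOPM, shown for illustration only; the paper
    itself uses a convex reformulation for positive tensors."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    # Random positive initialization of the three factor vectors.
    a, b, c = rng.random(I), rng.random(J), rng.random(K)
    a /= np.linalg.norm(a)
    b /= np.linalg.norm(b)
    c /= np.linalg.norm(c)
    for _ in range(iters):
        # Update each factor by contracting T against the other two.
        a = np.einsum('ijk,j,k->i', T, b, c); a /= np.linalg.norm(a)
        b = np.einsum('ijk,i,k->j', T, a, c); b /= np.linalg.norm(b)
        c = np.einsum('ijk,i,j->k', T, a, b); c /= np.linalg.norm(c)
    # Scale of the rank-1 approximation for unit-norm factors.
    lam = np.einsum('ijk,i,j,k->', T, a, b, c)
    return lam, a, b, c
```

On a tensor that is exactly rank-1 and positive, these iterations recover the factors; for general tensors HOPM is only a local method, which is precisely the gap that the paper's polynomial-time guarantee for positive tensors addresses.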
Recommendations
- On tensor completion via nuclear norm minimization
- Riemannian optimization for high-dimensional tensor completion
- On polynomial time methods for exact low-rank tensor completion
- Recovering orthogonal tensors under arbitrarily strong, but locally correlated, noise
- Low rank tensor recovery via iterative hard thresholding
- Tensor completion in hierarchical tensor representations
- Nonconvex Low-Rank Tensor Completion from Noisy Data
- Tensor completion and low-\(n\)-rank tensor recovery via convex optimization
- Noisy tensor completion via the sum-of-squares hierarchy
- Statistical mechanics of low-rank tensor decomposition
Cites work
- scientific article; zbMATH DE number 5968745 (no title available)
- scientific article; zbMATH DE number 49190 (no title available)
- scientific article; zbMATH DE number 729680 (no title available)
- scientific article; zbMATH DE number 2107836 (no title available)
- DOI: 10.1162/153244303321897690
- A Direct Formulation for Sparse PCA Using Semidefinite Programming
- A Singular Value Thresholding Algorithm for Matrix Completion
- A mathematical view of interior-point methods in convex optimization
- A new convergence proof for the higher-order power method and generalizations
- A survey of cross-validation procedures for model selection
- Algebraic algorithms for sampling from conditional distributions
- Better Bootstrap Confidence Intervals
- Concentration inequalities. A nonasymptotic theory of independence
- Covariance regularization by thresholding
- Diagonal and low-rank matrix decompositions, correlation matrices, and ellipsoid fitting
- Expressing combinatorial optimization problems by linear programs
- Fixed point and Bregman iterative methods for matrix rank minimization
- Graph implementations for nonsmooth convex programs
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- Lectures on algebraic statistics
- Matrix Completion From a Few Entries
- Matrix estimation by universal singular value thresholding
- Most tensor problems are NP-hard
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- On Tensors, Sparsity, and Nonnegative Factorizations
- On the Optimality of Conditional Expectation as a Bregman Predictor
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Positive tensor factorization
- Regression on manifolds: estimation of the exterior derivative
- Robust low-rank tensor recovery: models and algorithms
- Robust principal component analysis?
- Semidefinite relaxations for best rank-1 tensor approximations
- Simultaneously Structured Models With Application to Sparse and Low-Rank Matrices
- Tensor Decompositions and Applications
- Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem
- Tensor completion and low-\(n\)-rank tensor recovery via convex optimization
- Tensor completion in hierarchical tensor representations
- The Optimal Hard Threshold for Singular Values is \(4/\sqrt{3}\)
- Variational Analysis
- Worst-case and smoothed analysis of \(k\)-means clustering with Bregman divergences
Cited in (13)
- Multilayer tensor factorization with applications to recommender systems
- On polynomial time methods for exact low-rank tensor completion
- Spectral algorithms for tensor completion
- Positive tensor factorization
- Inverse optimization with noisy data
- On Best Low Rank Approximation of Positive Definite Tensors
- Tensor Dictionary Learning for Positive Definite Matrices
- Approximation algorithms for tensor clustering
- Tensor principal component analysis via convex optimization
- Relative error tensor low rank approximation
- Deterministic and Probabilistic Conditions for Finite Completability of Low-Tucker-Rank Tensor
- Learning with tensors: a framework based on convex optimization and spectral regularization
- Fiber sampling approach to canonical polyadic decomposition and application to tensor completion
This page was built for publication: Low-rank approximation and completion of positive tensors