On polynomial time methods for exact low-rank tensor completion (Q2007852)

From MaRDI portal
scientific article

    Statements

    On polynomial time methods for exact low-rank tensor completion (English)
    22 November 2019
    The goal of tensor completion is to recover a tensor from partial observations of its entries. Although even the second-order (matrix) case is NP-hard in general, tractable algorithms exist that achieve exact recovery with high probability. The question is whether the same can be said for higher-order tensors. The goal of this paper is to develop a computationally efficient approach, with a tight sample size requirement, for completing third-order tensors of low multilinear rank. The authors show that a gradient descent algorithm, with an initial value obtained from a new spectral method, can reconstruct a \(d_1 \times d_2 \times d_3\) tensor with multilinear ranks \((r_1,r_2,r_3)\) with high probability from as few as \(O\left(r_1r_2r_3(rd_1d_2d_3)^{1/2}\log^{7/2}d + (r_1r_2r_3)^2rd\log^6d\right)\) entries, where \(r=\max\{r_1,r_2,r_3\}\) and \(d=\max\{d_1,d_2,d_3\}\). For a cubic tensor with \(d_1=d_2=d_3=d\) and \(r_1=r_2=r_3=r\), this bound reads \(O\left(r^{7/2}d^{3/2}\log^{7/2}d + r^7d\log^6d\right)\). The authors are primarily interested in high-dimensional (large \(d\)), low-rank (small \(r\)) instances. The new second-order spectral method is guaranteed to produce an initial value sufficiently close to the optimum. Both the gradient descent algorithm and the second-order spectral algorithm have polynomial time complexity. The total computational cost of the method depends on the convergence rate of the gradient descent algorithm; numerical experiments suggest a linear convergence rate. The framework is established for third-order tensors, but it extends naturally to higher-order tensors.
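    The sketch below is only an illustration of the general recipe the review describes (spectral initialization followed by gradient descent on a low-multilinear-rank parametrization), not the authors' algorithm: the paper uses a second-order spectral initialization and gradient descent over Grassmannians, whereas this NumPy stand-in uses first-order SVDs of the unfoldings and plain gradient descent on Tucker factors. The function names `unfold`, `spectral_init`, and `tucker_gd`, as well as the step size and iteration count, are hypothetical choices made for this sketch.

```python
import numpy as np

def unfold(T, mode):
    """Mode-`mode` unfolding of a third-order tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def spectral_init(T_obs, ranks, p):
    """Simplified spectral initialization: rescale the zero-filled observations
    by 1/p (unbiased for the full tensor) and take the leading singular vectors
    of each unfolding as initial factor matrices."""
    T0 = T_obs / p
    U = []
    for k in range(3):
        u, _, _ = np.linalg.svd(unfold(T0, k), full_matrices=False)
        U.append(u[:, :ranks[k]])
    # Initial core: project the rescaled observations onto the factor spaces.
    G = np.einsum('ijk,ia,jb,kc->abc', T0, U[0], U[1], U[2])
    return G, U

def tucker_gd(T_obs, mask, ranks, p, steps=2000, lr=0.02):
    """Gradient descent on the Tucker factors (core G and factors U[0..2]),
    minimizing the squared error on the observed entries.  Step size and
    iteration count are illustrative and may need tuning."""
    G, U = spectral_init(T_obs, ranks, p)
    for _ in range(steps):
        X = np.einsum('abc,ia,jb,kc->ijk', G, U[0], U[1], U[2])
        R = mask * (X - T_obs)  # residual on the observed entries only
        # Gradients of 0.5 * ||R||_F^2 with respect to the core and each factor.
        gG = np.einsum('ijk,ia,jb,kc->abc', R, U[0], U[1], U[2])
        gU = [np.einsum('ijk,abc,jb,kc->ia', R, G, U[1], U[2]),
              np.einsum('ijk,abc,ia,kc->jb', R, G, U[0], U[2]),
              np.einsum('ijk,abc,ia,jb->kc', R, G, U[0], U[1])]
        G -= lr * gG / p  # dividing by p approximates the full-tensor gradient
        for k in range(3):
            U[k] -= lr * gU[k] / p
    return np.einsum('abc,ia,jb,kc->ijk', G, U[0], U[1], U[2])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, r, p = 30, 2, 0.3
    Us = [np.linalg.qr(rng.standard_normal((d, r)))[0] for _ in range(3)]
    T = np.einsum('abc,ia,jb,kc->ijk', rng.standard_normal((r, r, r)), *Us)
    mask = rng.random((d, d, d)) < p  # observe roughly 30% of the entries
    T_hat = tucker_gd(mask * T, mask, (r, r, r), p)
    print("relative error:", np.linalg.norm(T_hat - T) / np.linalg.norm(T))
```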
    tensor completion
    polynomial time complexity
    matrix completion
    concentration inequality
    convex optimization
    gradient descent algorithm
    spectral algorithm
    Grassmannian