Fundamental tensor operations for large-scale data analysis using tensor network formats

From MaRDI portal

DOI: 10.1007/s11045-017-0481-0
zbMATH Open: 1448.94107
arXiv: 1405.7786
OpenAlex: W2593392256
MaRDI QID: Q784596
FDO: Q784596


Authors: Namgil Lee, Andrzej Cichocki


Publication date: 3 August 2020

Published in: Multidimensional Systems and Signal Processing

Abstract: We discuss extended definitions of linear and multilinear operations such as Kronecker, Hadamard, and contracted products, and establish links between them for tensor calculus. Then we introduce effective low-rank tensor approximation techniques, including CANDECOMP/PARAFAC (CP), Tucker, and tensor train (TT) decompositions, with a number of mathematical and graphical representations. We also provide a brief review of the mathematical properties of the TT decomposition as a low-rank approximation technique. With the aim of breaking the curse of dimensionality in large-scale numerical analysis, we describe basic operations on large-scale vectors, matrices, and higher-order tensors represented in TT decomposition. The proposed representations can be used to describe numerical methods based on the TT decomposition for solving large-scale optimization problems such as systems of linear equations and symmetric eigenvalue problems.


Full work available at URL: https://arxiv.org/abs/1405.7786
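To make the abstract's central idea concrete, here is a minimal sketch of a tensor-train decomposition computed by sequential truncated SVDs, in the spirit of the TT-SVD algorithm the paper builds on. This is an illustration only, not the authors' code; the function names `tt_svd` and `tt_to_full` and the tolerance `eps` are assumptions for this sketch.

```python
# Hedged sketch: TT decomposition of a dense N-way array via sequential
# truncated SVDs, plus contraction back to the full tensor for checking.
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Decompose an N-way array into TT cores G_k of shape (r_{k-1}, n_k, r_k)."""
    dims = tensor.shape
    n = len(dims)
    cores = []
    rank = 1                                   # boundary rank r_0 = 1
    mat = tensor.reshape(rank * dims[0], -1)   # first unfolding
    for k in range(n - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r_new = max(1, int(np.sum(s > eps)))   # truncate tiny singular values
        cores.append(u[:, :r_new].reshape(rank, dims[k], r_new))
        # carry the remainder to the next unfolding
        mat = (np.diag(s[:r_new]) @ vt[:r_new]).reshape(r_new * dims[k + 1], -1)
        rank = r_new
    cores.append(mat.reshape(rank, dims[-1], 1))  # boundary rank r_N = 1
    return cores

def tt_to_full(cores):
    """Contract the TT cores back into the full tensor."""
    res = cores[0]
    for core in cores[1:]:
        res = np.tensordot(res, core, axes=([-1], [0]))
    return res.reshape(res.shape[1:-1])        # drop the boundary ranks
```

For a tensor that is exactly low-rank in the TT sense, the reconstruction is exact up to floating-point error, while the cores store far fewer entries than the full array; this storage reduction is what the abstract means by breaking the curse of dimensionality.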




Cited In: 26 publications


