Streaming Tensor Train Approximation
From MaRDI portal
Publication:6074548
Abstract: Tensor trains are a versatile tool to compress and work with high-dimensional data and functions. In this work we introduce the Streaming Tensor Train Approximation (STTA), a new class of algorithms for approximating a given tensor in the tensor train format. STTA accesses the tensor exclusively via two-sided random sketches of the original data, making it streamable and easy to implement in parallel -- unlike existing deterministic and randomized tensor train approximations. This property also allows STTA to conveniently leverage structure in the tensor, such as sparsity and various low-rank tensor formats, as well as linear combinations thereof. When Gaussian random matrices are used for sketching, STTA admits an analysis that builds and extends upon existing results on the generalized Nyström approximation for matrices. Our results show that STTA can be expected to attain a nearly optimal approximation error if the sizes of the sketches are suitably chosen. A range of numerical experiments illustrates the performance of STTA compared to existing deterministic and randomized approaches.
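The matrix case underlying STTA's analysis, the generalized Nyström approximation, can be sketched in a few lines. The following is a minimal NumPy illustration under our own assumptions (function name, sketch sizes, and oversampling parameter are illustrative, not the authors' implementation): the matrix is accessed only through products with two random sketch matrices, which is what makes the approach streamable.

```python
import numpy as np

rng = np.random.default_rng(0)

def generalized_nystrom(A, r, p=5):
    """Rank-r approximation A_hat = (A Omega) (Psi^T A Omega)^+ (Psi^T A).

    A is touched only through the sketches A @ Omega and Psi.T @ A,
    so updates to A can be folded into the sketches as they stream in.
    p is an oversampling parameter for the left sketch (illustrative choice).
    """
    m, n = A.shape
    Omega = rng.standard_normal((n, r))    # right Gaussian sketch
    Psi = rng.standard_normal((m, r + p))  # oversampled left Gaussian sketch
    AO = A @ Omega                         # m x r
    PA = Psi.T @ A                         # (r+p) x n
    PAO = Psi.T @ AO                       # (r+p) x r core sketch
    # Apply the pseudoinverse stably via a least-squares solve.
    return AO @ np.linalg.lstsq(PAO, PA, rcond=None)[0]

# Sanity check on an exactly rank-10 matrix: the sketch-based
# approximation recovers it (almost surely) when r matches the rank.
U = rng.standard_normal((100, 10))
V = rng.standard_normal((10, 80))
A = U @ V
A_hat = generalized_nystrom(A, 10)
err = np.linalg.norm(A - A_hat) / np.linalg.norm(A)
```

STTA extends this two-sided sketching idea from matrices to the chain of unfoldings that defines a tensor train, sketching each unfolding rather than the full tensor.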
Recommendations
- Low-rank Tucker approximation of a tensor from streaming data
- Parallel Algorithms for Computing the Tensor-Train Decomposition
- SOTT: greedy approximation of a tensor as a sum of tensor trains
- Tensor train construction from tensor actions, with application to compression of large high order derivative tensors
- A continuous analogue of the tensor-train decomposition
Cites work
- A literature survey of low-rank tensor approximation techniques
- A practical introduction to tensor networks: Matrix product states and projected entangled pair states
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions
- Geometric Methods on Low-Rank Matrix and Tensor Manifolds
- Numerical linear algebra in the streaming model
- Parallel Algorithms for Computing the Tensor-Train Decomposition
- Practical sketching algorithms for low-rank matrix approximation
- Randomized Algorithms for Rounding in the Tensor-Train Format
- Randomized algorithms for the approximations of Tucker and the tensor train decompositions
- Tensor Decompositions and Applications
- Tensor Spaces and Numerical Tensor Calculus
- Tensor networks and hierarchical tensors for the solution of high-dimensional partial differential equations
- Tensor numerical methods in scientific computing
- Tensor train construction from tensor actions, with application to compression of large high order derivative tensors
- Tensor-train decomposition
- The density-matrix renormalization group in the age of matrix product states
Cited in (14)
- SVD-based algorithms for fully-connected tensor network decomposition
- An algorithm for arbitrary-order cumulant tensor calculation in a sliding window of data streams
- SOTT: greedy approximation of a tensor as a sum of tensor trains
- Approximation in the extended functional tensor train format
- A continuous analogue of the tensor-train decomposition
- An Incremental Tensor Train Decomposition Algorithm
- A sequential multilinear Nyström algorithm for streaming low-rank approximation of tensors in Tucker format
- SVD-based algorithms for tensor wheel decomposition
- Low-rank Tucker approximation of a tensor from streaming data
- Randomized Algorithms for Rounding in the Tensor-Train Format
- Tensor train construction from tensor actions, with application to compression of large high order derivative tensors
- Efficient construction of tensor ring representations from sampling
- Tracking tensor ring decompositions of streaming tensors
- Randomized tensor wheel decomposition