MERACLE: constructive layer-wise conversion of a tensor train into a MERA
From MaRDI portal
Publication:2667352
Abstract: In this article, two new algorithms are presented that convert a given data tensor train into either a Tucker decomposition with orthogonal matrix factors or a multi-scale entanglement renormalization ansatz (MERA). The Tucker core tensor is never explicitly computed but is instead stored as a tensor train, resulting in algorithms that are efficient in both computation and storage. Both the multilinear Tucker ranks and the MERA ranks are automatically determined by the algorithms for a given upper bound on the relative approximation error. In addition, an iterative algorithm with low computational complexity, based on solving an orthogonal Procrustes problem, is proposed for the first time to retrieve optimal rank-lowering disentangler tensors, which are a crucial component in the construction of a low-rank MERA. Numerical experiments demonstrate the effectiveness of the proposed algorithms together with the potential storage benefit of a low-rank MERA over a tensor train.
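The abstract's disentangler update relies on the orthogonal Procrustes problem: finding the orthogonal matrix Q minimizing ||AQ - B||_F, which has the classical closed-form solution Q = UV^T from the SVD A^T B = USV^T. The sketch below illustrates this building block only; the function name and the point-cloud alignment example are illustrative and are not taken from the paper's MERA construction.

```python
import numpy as np

def orthogonal_procrustes(A, B):
    """Return the orthogonal Q minimizing ||A @ Q - B||_F.

    Classical SVD-based solution (hypothetical helper; the paper
    embeds this step inside its disentangler-update iteration).
    """
    U, _, Vt = np.linalg.svd(A.T @ B)
    return U @ Vt

# Usage: recover a random orthogonal transform applied to a point cloud.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))
R, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthogonal matrix
B = A @ R
Q = orthogonal_procrustes(A, B)
assert np.allclose(Q, R)  # the closed-form solution recovers R exactly
```

Each Procrustes solve costs only one small SVD, which is why an iteration built on it has low computational complexity.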
Recommendations
- Efficient tree decomposition of high-rank tensors
- Nonnegative tensor train factorization with DMRG technique
- Tensor train construction from tensor actions, with application to compression of large high order derivative tensors
- DMRG approach to fast linear algebra in the TT-format
- Multiresolution low-rank tensor formats
Cites work
- Scientific article; zbMATH DE number 6159604 (no title available)
- A Multilinear Singular Value Decomposition
- A new scheme for the tensor representation
- A new truncation strategy for the higher-order singular value decomposition
- A note on tensor chain approximation
- Analysis of individual differences in multidimensional scaling via an \(n\)-way generalization of "Eckart-Young" decomposition
- Deflation Techniques for an Implicitly Restarted Arnoldi Iteration
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions
- Hierarchical Singular Value Decomposition of Tensors
- Optimization problems in contracted tensor networks
- TT-cross approximation for multidimensional arrays
- Tensor Decomposition for Signal Processing and Machine Learning
- Tensor Decompositions and Applications
- Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 2 Applications and Future Perspectives
- Tensor networks for dimensionality reduction and large-scale optimization. I: Low-rank tensor decompositions
- Tensor-train decomposition
- The alternating linear scheme for tensor optimization in the tensor train format
- The density-matrix renormalization group in the age of matrix product states
- Two-level QTT-Tucker format for optimized tensor calculus
- \(O(d \log N)\)-quantics approximation of \(N\)-\(d\) tensors in high-dimensional numerical modeling