MERACLE: constructive layer-wise conversion of a tensor train into a MERA
Publication: 2667352
DOI: 10.1007/s42967-020-00090-6 · zbMATH Open: 1476.15024 · arXiv: 1912.09775 · OpenAlex: W3094313765 · MaRDI QID: Q2667352 · FDO: Q2667352
Authors: Yanyan Li
Publication date: 24 November 2021
Published in: Communications on Applied Mathematics and Computation
Abstract: This article presents two new algorithms that convert a given data tensor train into either a Tucker decomposition with orthogonal factor matrices or a multi-scale entanglement renormalization ansatz (MERA). The Tucker core tensor is never formed explicitly but is instead stored as a tensor train, which makes the algorithms efficient in both computation and storage. Both the multilinear Tucker ranks and the MERA ranks are determined automatically by the algorithm from a given upper bound on the relative approximation error. In addition, an iterative algorithm of low computational complexity, based on solving an orthogonal Procrustes problem, is proposed for the first time to compute optimal rank-lowering disentangler tensors, which are a crucial component in the construction of a low-rank MERA. Numerical experiments demonstrate the effectiveness of the proposed algorithms and the potential storage advantage of a low-rank MERA over a tensor train.
Full work available at URL: https://arxiv.org/abs/1912.09775
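The rank-adaptive idea described in the abstract — truncation ranks chosen automatically from an upper bound on the relative approximation error — is standard in tensor-train computations. The following is a minimal NumPy sketch of the classic TT-SVD, not the paper's TT-to-MERA or TT-to-Tucker conversion itself, illustrating how sequential truncated SVDs with an evenly split error budget determine the ranks; all function names here are illustrative, not from the paper.

```python
import numpy as np

def tt_svd(x, rel_err=1e-8):
    """Decompose a dense NumPy array into tensor-train (TT) cores using
    sequential truncated SVDs; the TT ranks are chosen automatically so
    that the overall relative approximation error stays below rel_err."""
    dims = x.shape
    d = len(dims)
    # Split the error budget evenly over the d-1 truncation steps.
    delta = rel_err * np.linalg.norm(x) / np.sqrt(max(d - 1, 1))
    cores, r_prev = [], 1
    c = x.reshape(r_prev * dims[0], -1)
    for k in range(d - 1):
        u, s, vt = np.linalg.svd(c, full_matrices=False)
        # Smallest rank r whose discarded singular-value tail fits in delta.
        r = len(s)
        while r > 1 and np.linalg.norm(s[r - 1:]) <= delta:
            r -= 1
        cores.append(u[:, :r].reshape(r_prev, dims[k], r))
        c = (s[:r, None] * vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(c.reshape(r_prev, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract the TT cores back into the full tensor (for verification)."""
    res = cores[0]
    for g in cores[1:]:
        res = np.tensordot(res, g, axes=([-1], [0]))
    return res[0, ..., 0]
```

For a rank-one test tensor, the error criterion drives every TT rank down to 1, while the reconstruction stays within the requested tolerance — the same automatic-rank behavior the paper's algorithms provide for the Tucker and MERA formats.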
Recommendations
- Efficient tree decomposition of high-rank tensors
- Nonnegative tensor train factorization with DMRG technique
- Tensor train construction from tensor actions, with application to compression of large high order derivative tensors
- DMRG approach to fast linear algebra in the TT-format
- Multiresolution low-rank tensor formats
Factorization of matrices (15A23); Multilinear algebra, tensor calculus (15A69); Numerical linear algebra (65F99)
Cites Work
- Analysis of individual differences in multidimensional scaling via an \(n\)-way generalization of ``Eckart-Young'' decomposition
- Tensor Decompositions and Applications
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions
- Tensor-train decomposition
- TT-cross approximation for multidimensional arrays
- Hierarchical Singular Value Decomposition of Tensors
- A Multilinear Singular Value Decomposition
- Deflation Techniques for an Implicitly Restarted Arnoldi Iteration
- The density-matrix renormalization group in the age of matrix product states
- Optimization problems in contracted tensor networks
- A new scheme for the tensor representation
- Two-level QTT-Tucker format for optimized tensor calculus
- The alternating linear scheme for tensor optimization in the tensor train format
- \(O(d \log N)\)-quantics approximation of \(N\)-\(d\) tensors in high-dimensional numerical modeling
- A note on tensor chain approximation
- Tensor Decomposition for Signal Processing and Machine Learning
- A new truncation strategy for the higher-order singular value decomposition
- Tensor networks for dimensionality reduction and large-scale optimization. I: Low-rank tensor decompositions
- Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 2 Applications and Future Perspectives
Cited In (1)