MERACLE: constructive layer-wise conversion of a tensor train into a MERA

From MaRDI portal
Publication: 2667352

DOI: 10.1007/s42967-020-00090-6
zbMATH Open: 1476.15024
arXiv: 1912.09775
OpenAlex: W3094313765
MaRDI QID: Q2667352


Authors: Yanyan Li


Publication date: 24 November 2021

Published in: Communications on Applied Mathematics and Computation

Abstract: This article presents two new algorithms that convert a given data tensor train into either a Tucker decomposition with orthogonal factor matrices or a multi-scale entanglement renormalization ansatz (MERA). The Tucker core tensor is never computed explicitly but is instead stored as a tensor train, which makes the algorithms efficient in both computation and storage. Both the multilinear Tucker ranks and the MERA ranks are determined automatically by the algorithm for a given upper bound on the relative approximation error. In addition, an iterative algorithm of low computational complexity, based on solving an orthogonal Procrustes problem, is proposed for the first time to retrieve optimal rank-lowering disentangler tensors, a crucial component in the construction of a low-rank MERA. Numerical experiments demonstrate the effectiveness of the proposed algorithms and the potential storage benefit of a low-rank MERA over a tensor train.
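The Procrustes step mentioned in the abstract can be illustrated in isolation. The following NumPy sketch solves the classical orthogonal Procrustes problem — minimize ||A − QB||_F over orthogonal Q — via a single SVD; this is the standard closed-form solution, not the paper's full iterative disentangler-retrieval algorithm, and the function name is ours for illustration.

```python
import numpy as np

def orthogonal_procrustes(A, B):
    """Return the orthogonal Q minimizing ||A - Q @ B||_F.

    Classical closed-form solution: if U @ diag(s) @ Vt is the SVD
    of A @ B.T, then the minimizer is Q = U @ Vt.
    """
    U, _, Vt = np.linalg.svd(A @ B.T)
    return U @ Vt

# Sanity check: rotate B by a known orthogonal matrix and recover it.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 10))
Q_true, _ = np.linalg.qr(rng.standard_normal((4, 4)))
A = Q_true @ B
Q = orthogonal_procrustes(A, B)
```

When B has full row rank the minimizer is unique, so Q matches Q_true up to numerical precision; in the paper's setting, such a solve would appear inside an iteration that alternates it with rank-truncation steps.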


Full work available at URL: https://arxiv.org/abs/1912.09775

Cited In (1)


