Tensor networks in machine learning (Q6160060)

From MaRDI portal
Property / DOI: 10.4171/mag/101 / rank
Normal rank
Property / cites work: Physics, Topology, Logic and Computation: A Rosetta Stone / rank
Normal rank
Property / cites work: Tensor network methods for invariant theory / rank
Normal rank
Property / cites work: Q5483032 / rank
Normal rank
Property / cites work: 10.1162/153244302760200704 / rank
Normal rank
Property / cites work: Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 1 Low-Rank Tensor Decompositions / rank
Normal rank
Property / cites work: Support-vector networks / rank
Normal rank
Property / cites work: Enumeration of Seven-Argument Threshold Functions / rank
Normal rank
Property / cites work: Finitely correlated states on quantum spin chains / rank
Normal rank
Property / cites work: Frustration free gapless Hamiltonians for matrix product states / rank
Normal rank
Property / cites work: A logical calculus of the ideas immanent in nervous activity / rank
Normal rank
Property / cites work: Tensor-Train Decomposition / rank
Normal rank
Property / cites work: Q5620163 / rank
Normal rank
Property / cites work: Q3522633 / rank
Normal rank
Property / cites work: Are Loss Functions All the Same? / rank
Normal rank
Property / cites work: A Survey of Graphical Languages for Monoidal Categories / rank
Normal rank

Latest revision as of 19:00, 30 December 2024

Language: English
Label: Tensor networks in machine learning
Description: scientific article; zbMATH DE number 7683395

    Statements

    Tensor networks in machine learning (English)
    9 May 2023
    Summary: A tensor network is a type of decomposition used to express and approximate large arrays of data. A given dataset, quantum state, or higher-dimensional multilinear map is factored and approximated by a composition of smaller multilinear maps. This is reminiscent of how a Boolean function might be decomposed into a gate array: that decomposition represents a special case of tensor decomposition, in which the tensor entries are restricted to 0 and 1 and the factorisation becomes exact. The associated techniques are called tensor network methods: the subject developed independently in several distinct fields of study, which have more recently become interrelated through the language of tensor networks. The central questions in the field concern the expressivity of tensor networks and the reduction of computational overheads. A merger of tensor networks with machine learning is natural. On the one hand, machine learning can aid in determining a factorisation of a tensor network approximating a dataset. On the other hand, a given tensor network structure can be viewed as a machine learning model, whose parameters are adjusted to learn or classify a dataset. In this survey, we review the basics of tensor networks and explain the ongoing effort to develop the theory of tensor networks in machine learning.
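The summary's central idea, factoring a large array into a composition of smaller multilinear maps, can be illustrated with a minimal tensor-train (TT) decomposition sketch in NumPy (an illustration only, not code from the surveyed article; the function names and the `max_rank` truncation parameter are my own). Each 3-way core is obtained by a truncated SVD of a sequential unfolding of the tensor:

```python
import numpy as np

def tt_decompose(tensor, max_rank):
    """Factor a d-way array into a chain of 3-way TT cores via sequential SVDs."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    r_prev = 1                       # rank of the bond entering the current core
    rest = tensor.reshape(shape[0], -1)
    for k in range(d - 1):
        # Unfold: rows group (previous bond x current mode), columns hold the rest.
        mat = rest.reshape(r_prev * shape[k], -1)
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, S.size)    # truncate the bond dimension
        cores.append(U[:, :r].reshape(r_prev, shape[k], r))
        rest = np.diag(S[:r]) @ Vt[:r]   # carry the remainder to the next core
        r_prev = r
    cores.append(rest.reshape(r_prev, shape[d - 1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a full array."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])
```

With `max_rank` large enough to keep all singular values, the factorisation is exact, matching the summary's remark about exactness in the special case; smaller values of `max_rank` trade accuracy for compression.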

    Identifiers
