Approximation of high-dimensional rank one tensors (Q485311)

From MaRDI portal
scientific article

    Statements

    Approximation of high-dimensional rank one tensors (English)
    9 January 2015
    It is known that the approximation of smoothness classes of functions suffers from the so-called ``curse of dimensionality''. Avoiding this curse requires new model classes for real-world functions that match applications, which has led to the introduction of notions such as sparsity, variable reduction, and reduced modeling. One particularly common theme is to assume a tensor structure for the target function. This paper investigates how well a rank one function \(f(x_1,\dots,x_d) = f_1(x_1)\cdots f_d(x_d)\), defined on \(\Omega = [0, 1]^d\), can be captured through point queries. It is shown that such a rank one function with component functions \(f_j\) in \(W_\infty^r([0, 1])\) can be captured (in \(L_\infty\)) to accuracy \(O(C(d,r)N^{-r})\) from \(N\) well-chosen point evaluations, where the constant \(C(d,r)\) scales like \(d^{dr}\). The queries in the algorithms discussed have two ingredients: a set of points built on results from discrepancy theory, and a second, adaptive set of queries that depends on the information drawn from the first set. Under the assumption that a point \(z\in\Omega\) with non-vanishing \(f(z)\) is known, the accuracy improves to \(O(dN^{-r})\).
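    A brief remark on why knowledge of such a point \(z\) helps (an elementary identity offered here only as an illustration, not as the paper's algorithm): if \(f\) is rank one and \(f(z)\neq 0\), then
    \[
    f(x_1,\dots,x_d) \;=\; \frac{1}{f(z)^{d-1}} \prod_{j=1}^{d} f(z_1,\dots,z_{j-1},x_j,z_{j+1},\dots,z_d),
    \]
    since the \(j\)-th factor equals \(f_j(x_j)\prod_{i\neq j} f_i(z_i)\) and the superfluous factors multiply to \(f(z)^{d-1}\). Thus \(f\) is determined by its values along the \(d\) axis-parallel lines through \(z\), and the task reduces to \(d\) univariate approximation problems for functions in \(W_\infty^r([0,1])\), each accessible from point evaluations at the classical univariate rate; the bound \(O(dN^{-r})\) quoted above is the paper's sharper result.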
    query algorithms
    high-dimensional approximation
    separable functions
    rate of approximation