Random points are good for universal discretization (Q6074492)

From MaRDI portal
scientific article; zbMATH DE number 7739934

    Statements

    Random points are good for universal discretization (English)
    19 September 2023
    In many areas of approximation theory and its applications, such as sampling, the discretization of integrals, and especially of norms defined by integrals, is essential. A key question is the distribution of the evaluation points that define the finite sums employed to approximate the integral; in other words, one wants to know which distributions are optimal in some sense, minimizing the number of points while maximizing the accuracy. In this paper, very powerful results are presented which give bounds on the accuracy of these discretizations that hold with high probability. The interesting point is that good sampling points can be chosen randomly. The discretization problems can be formulated as sampling discretization, where for a fixed finite-dimensional function space the discretized versions of the norms are bounded from above and below in terms of the actual \(L_p\)-norms (so-called Marcinkiewicz-type discretization theorems). These bounds of course involve fixed numerical factors, which here turn out to be \(1\pm\frac12\), that is, \(\frac32\) for the upper and \(\frac12\) for the lower bound. On top of this, the article studies the problem of universal discretization, where the same two-sided bounds are required simultaneously for a finite collection of fixed finite-dimensional function spaces (\(k\) of them, say). One is of course also interested in minimizing the number of sampling points in the discretization for which the Marcinkiewicz-type bounds hold. The subspaces used in this case are generated by so-called dictionaries. The article's main and remarkable results provide lower bounds on the probability that the estimates with factors \(1\pm\frac12\) are achieved, under the condition that the number of sampling points itself satisfies a lower bound depending on the size of the dictionary. A greedy (descent) method is also presented (in another main theorem) to give sparse recovery results.
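    For orientation, a minimal sketch of the two-sided estimate in question, with the factors \(1\pm\frac12\) mentioned above (the normalization and notation here are assumed, not quoted from the paper): a Marcinkiewicz-type discretization theorem for a finite-dimensional subspace \(X\subset L_p\) asks for sampling points \(\xi^1,\dots,\xi^m\) such that
    \[
    \tfrac{1}{2}\,\|f\|_p^p \;\le\; \frac{1}{m}\sum_{j=1}^{m}\bigl|f(\xi^j)\bigr|^p \;\le\; \tfrac{3}{2}\,\|f\|_p^p \qquad \text{for all } f\in X,
    \]
    and universal discretization asks for a single point set \(\{\xi^j\}_{j=1}^{m}\) for which this holds simultaneously for each of the \(k\) given subspaces.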
    sampling discretization
    universality
    recovery
