Nonlinear approximation and (deep) ReLU networks (Q2117331)

From MaRDI portal

scientific article

      Statements

      Nonlinear approximation and (deep) ReLU networks (English)
      21 March 2022
      The authors address the approximation power of ReLU (rectified linear unit) networks and investigate whether such networks genuinely provide greater approximation efficiency than classical methods of approximation. The discussion is restricted to the univariate setting, where one has the best chance of obtaining definitive results. The authors focus on the advantages of depth, i.e., on what deep networks can accomplish that shallow networks cannot.
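      A minimal NumPy sketch (not taken from the portal entry or the article itself) of the classical sawtooth construction that is commonly used to illustrate this univariate depth advantage: a hat function is represented exactly by one ReLU layer with three units, and composing it L times yields a piecewise-linear function with 2^L linear pieces, whereas a single-hidden-layer ReLU network needs on the order of 2^L units to create that many breakpoints. The function names and the piece-counting heuristic below are illustrative choices, not the paper's notation.

      import numpy as np

      def relu(x):
          return np.maximum(x, 0.0)

      def hat(x):
          # Hat function: 2x on [0, 1/2], 2 - 2x on [1/2, 1], 0 elsewhere,
          # written as a single ReLU layer with three units.
          return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

      def sawtooth(x, depth):
          # Composing the hat function `depth` times gives a sawtooth with
          # 2**depth linear pieces on [0, 1], using only 3 units per layer.
          y = x
          for _ in range(depth):
              y = hat(y)
          return y

      xs = np.linspace(0.0, 1.0, 1025)
      for L in (1, 3, 6):
          ys = sawtooth(xs, L)
          # Count linear pieces via nonzero second differences; the dyadic
          # breakpoints coincide with sample points on this grid.
          pieces = int(np.count_nonzero(np.abs(np.diff(ys, 2)) > 1e-9)) + 1
          print(f"depth {L}: {pieces} linear pieces")

      Running the loop should report 2, 8 and 64 pieces, i.e., the number of linear pieces grows exponentially with depth at constant width.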
      Keywords: neural networks; rectified linear unit (ReLU); expressiveness; approximation power
