Nonlinear approximation and (deep) ReLU networks (Q2117331)

From MaRDI portal
Property / describes a project that uses: AlexNet (normal rank)


Language: English
Label: Nonlinear approximation and (deep) ReLU networks
Description: scientific article

    Statements

    Nonlinear approximation and (deep) ReLU networks (English)
    21 March 2022
    The authors address the approximation power of ReLU (rectified linear unit) networks and investigate whether such networks provide genuinely greater approximation efficiency than classical methods of approximation. The discussion is confined to the univariate setting, where one has the best chance of obtaining definitive results. The authors focus on the advantages of depth, i.e., on what deep networks can achieve that shallow networks cannot.
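    A standard illustration of the depth advantage discussed here (a minimal sketch, not code from the paper itself) is the composition of the univariate "hat" function: one hat is exactly a width-3, one-hidden-layer ReLU network, and composing it n times yields a sawtooth with 2**n linear pieces on [0, 1]. A depth-n ReLU network of constant width thus produces a function that a shallow network would need on the order of 2**n units to represent.

    ```python
    def relu(t):
        """The rectified linear unit, max(t, 0)."""
        return max(t, 0.0)

    def hat(t):
        # The hat function on [0, 1]: rises linearly from 0 to 1 on [0, 1/2],
        # falls back to 0 on [1/2, 1]. It is exactly representable by a
        # single ReLU layer with three units:
        #   h(t) = 2 relu(t) - 4 relu(t - 1/2) + 2 relu(t - 1)
        return 2 * relu(t) - 4 * relu(t - 0.5) + 2 * relu(t - 1.0)

    def sawtooth(t, depth):
        # Composing the hat function `depth` times gives a sawtooth with
        # 2**depth linear pieces -- realized by a ReLU network of depth
        # `depth` and constant width 3.
        for _ in range(depth):
            t = hat(t)
        return t

    # At depth 4 the composition hits 1 at odd multiples of 1/16
    # and 0 at even multiples: 16 linear pieces from 12 ReLU units.
    values = [sawtooth(k / 16, 4) for k in range(17)]
    ```
    
    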
    neural networks
    rectified linear unit (ReLU)
    expressiveness
    approximation power