Nonlinear approximation and (deep) ReLU networks (Q2117331)

From MaRDI portal
Property / MaRDI profile type: MaRDI publication profile
Property / OpenAlex ID: W3160447895
Property / arXiv ID: 1905.02199

Latest revision as of 02:04, 19 April 2024

Language: English
Label: Nonlinear approximation and (deep) ReLU networks
Description: scientific article

    Statements

    Nonlinear approximation and (deep) ReLU networks (English)
    21 March 2022
    The authors address the approximation power of ReLU (rectified linear unit) networks and investigate whether such networks truly offer greater approximation efficiency than classical methods of approximation. The discussion is confined to the univariate setting, where one has the best chance of obtaining definitive results. The authors focus on the advantages of depth, i.e., on what deep networks can achieve that shallow networks cannot.
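    The depth advantage discussed in the review is often illustrated by the classical sawtooth construction: composing a fixed three-neuron "hat" network with itself k times yields a piecewise-linear function with exponentially many oscillations, which a shallow ReLU network would need exponentially many neurons to reproduce exactly. The sketch below (not taken from the paper itself; function names are ours) demonstrates this in plain Python.

```python
def relu(t):
    """ReLU activation: max(t, 0)."""
    return max(t, 0.0)

def hat(t):
    """Hat function on [0, 1], realized exactly by a one-hidden-layer
    ReLU network with three neurons:
        H(t) = 2*relu(t) - 4*relu(t - 1/2) + 2*relu(t - 1).
    H rises linearly from 0 to 1 on [0, 1/2] and falls back to 0 on [1/2, 1]."""
    return 2 * relu(t) - 4 * relu(t - 0.5) + 2 * relu(t - 1.0)

def sawtooth(t, k):
    """k-fold composition H o ... o H: a depth-k ReLU network with O(k)
    parameters whose graph has 2**(k-1) triangular "teeth" on [0, 1].
    A one-hidden-layer ReLU network of width n is piecewise linear with
    at most n + 1 pieces, so matching this function exactly requires
    width exponential in k."""
    for _ in range(k):
        t = hat(t)
    return t

print(sawtooth(0.25, 2))  # hat(0.25) = 0.5, then hat(0.5) = 1.0, so prints 1.0
```

The point of the composition is that depth multiplies the number of linear pieces, while width only adds to it; this is exactly the kind of expressiveness gap between deep and shallow networks that the paper examines.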
    Keywords: neural networks; rectified linear unit (ReLU); expressiveness; approximation power