Nonlinear approximation and (deep) ReLU networks (Q2117331)

From MaRDI portal
Property / describes a project that uses: AlexNet
Property / describes a project that uses: GNMT
Property / MaRDI profile type: MaRDI publication profile
Property / OpenAlex ID: W3160447895
Property / arXiv ID: 1905.02199
Property / cites work: The Takagi function: a survey
Property / cites work: Optimal Approximation with Sparsely Connected Deep Neural Networks
Property / cites work: Neural Networks for Localized Approximation
Property / cites work: Approximation by superpositions of a sigmoidal function
Property / cites work: Q4215356
Property / cites work: Optimal nonlinear approximation
Property / cites work: Wavelet compression and nonlinear \(n\)-widths
Property / cites work: Q4273944
Property / cites work: Exponential convergence of the deep neural network approximation for analytic functions
Property / cites work: Q3807471
Property / cites work: Multilayer feedforward networks are universal approximators
Property / cites work: Deep Network Approximation for Smooth Functions
Property / cites work: Deep vs. shallow networks: An approximation theory perspective
Property / cites work: Deep learning in high dimension: Neural network expression rates for generalized polynomial chaos expansions in UQ
Property / cites work: Provable approximation properties for deep neural networks
Property / cites work: Deep Network Approximation Characterized by Number of Neurons
Property / cites work: Q4800024
Property / cites work: Weierstrass' function and chaos
Property / cites work: Error bounds for approximations with deep ReLU networks

Latest revision as of 10:11, 28 July 2024

scientific article

Language: English
Label: Nonlinear approximation and (deep) ReLU networks
Description: scientific article

    Statements

    Nonlinear approximation and (deep) ReLU networks (English)
    21 March 2022
    The authors address the approximation power of ReLU (rectified linear unit) networks and investigate whether such networks genuinely achieve greater approximation efficiency than classical methods of approximation. The discussion is confined to the univariate setting, where one has the best chance of obtaining definitive results. The authors focus on the advantages of depth, i.e., on approximation properties of deep networks that shallow networks do not possess.
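    The depth advantage discussed in this review is often illustrated in the ReLU approximation literature by the classical sawtooth construction: a single hidden layer of two ReLU units realizes a piecewise-linear "hat" function, and composing that hat with itself k times yields a sawtooth with exponentially many linear pieces, which a shallow network of comparable size cannot reproduce. The sketch below (not taken from the paper under review; the function names are illustrative) shows the construction:

    ```python
    def relu(t):
        # The rectified linear unit: max(t, 0).
        return max(t, 0.0)

    def hat(x):
        # Two ReLU units in one hidden layer realize the hat function
        # h(x) = 2x on [0, 1/2] and h(x) = 2 - 2x on [1/2, 1].
        return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

    def sawtooth(x, k):
        # Composing the hat with itself k times gives a sawtooth on [0, 1]
        # with 2**(k-1) teeth: exponentially many linear pieces produced by
        # a depth-k ReLU network with only two units per layer.
        for _ in range(k):
            x = hat(x)
        return x
    ```

    For example, `sawtooth(x, 2)` peaks at x = 1/4 and x = 3/4, doubling the single peak of `hat` at x = 1/2; each additional composition doubles the number of teeth while adding only a constant number of units.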
    neural networks
    rectified linear unit (ReLU)
    expressiveness
    approximation power