High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions (Q2118396)

From MaRDI portal

scientific article
Language: English
Label: High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions
Description: scientific article
Also known as:

    Statements

    High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions (English)
    22 March 2022
    neural networks
    approximation rates
    approximation lower bounds
    finite element methods

    Identifiers

    DOI: 10.1016/j.acha.2021.12.005