High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions (Q2118396)
From MaRDI portal
Author: Jin-Chao Xu
scientific article
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions | scientific article | |
Statements
High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions (English)
Publication date: 22 March 2022
Keywords: neural networks; approximation rates; approximation lower bounds; finite element methods