High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions (Q2118396)
scientific article

| Language | Label | Description | Also known as |
|---|---|---|---|
| default for all languages | No label defined | | |
| English | High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions | scientific article | |
Statements
High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions (English)
22 March 2022
neural networks
approximation rates
approximation lower bounds
finite element methods
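For context on the object named in the title: a shallow network with \(\mathrm{ReLU}^k\) activation is a one-hidden-layer model of the form \(\sum_{i=1}^n a_i\,\sigma_k(\omega_i \cdot x + b_i)\), where \(\sigma_k(t) = \max(0, t)^k\). The following is a minimal NumPy sketch of such a network; all names, shapes, and parameter values are illustrative and not taken from the article itself.

```python
import numpy as np

def relu_k(x, k):
    """ReLU^k activation: max(0, x) raised to the power k."""
    return np.maximum(0.0, x) ** k

def shallow_net(x, weights, biases, coeffs, k=2):
    """Evaluate sum_i coeffs[i] * ReLU^k(weights[i] . x + biases[i])."""
    pre = x @ weights.T + biases      # (n_samples, n_neurons)
    return relu_k(pre, k) @ coeffs    # (n_samples,)

# Illustrative example: n = 50 random neurons in d = 3 dimensions.
rng = np.random.default_rng(0)
d, n = 3, 50
W = rng.standard_normal((n, d))   # inner weights omega_i
b = rng.standard_normal(n)        # biases b_i
a = rng.standard_normal(n) / n    # outer coefficients a_i
X = rng.standard_normal((10, d))  # 10 sample points
print(shallow_net(X, W, b, a, k=2).shape)  # (10,)
```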