Integral representations of shallow neural network with Rectified Power Unit activation function
From MaRDI portal
Publication:6386309
DOI: 10.1016/j.neunet.2022.09.005
arXiv: 2112.11157
Wikidata: Q114145524 · Scholia: Q114145524
MaRDI QID: Q6386309
Ahmed Abdeljawad, Philipp Grohs
Publication date: 20 December 2021
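For context on the title: the Rectified Power Unit (RePU) generalizes the ReLU by raising the rectified input to an integer power p, i.e. σ_p(x) = max(0, x)^p, with p = 1 recovering the ReLU. A shallow network in this sense has a single hidden layer. Below is a minimal illustrative sketch (not code from the publication) of such a network in NumPy; all names and the random weights are illustrative assumptions.

```python
import numpy as np

def repu(x, p=2):
    """Rectified Power Unit: max(0, x)**p. With p=1 this is the ReLU."""
    return np.maximum(0.0, x) ** p

def shallow_repu_net(x, W, b, c, p=2):
    """One-hidden-layer network: sum_i c_i * repu(<w_i, x> + b_i)."""
    return repu(W @ x + b, p) @ c

# Illustrative random instance: 8 hidden units, 3-dimensional input.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 3))
b = rng.standard_normal(8)
c = rng.standard_normal(8)
x = rng.standard_normal(3)
y = shallow_repu_net(x, W, b, c, p=2)
```

Unlike the ReLU, the RePU with p ≥ 2 is (p−1)-times continuously differentiable, which is one reason it appears in approximation-theoretic analyses of this kind.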
Cites Work
- Support theorems for the Radon transform and Cramér-Wold theorems
- Sparsest solutions of underdetermined linear systems via \(\ell_q\)-minimization for \(0 < q \leqslant 1\)
- Asymptotic formulas for the dual Radon transform and applications
- Harmonic analysis of neural networks
- The Radon transform.
- Multilayer feedforward networks are universal approximators
- Complexity estimates based on integral transforms induced by computational units
- Nonparametric regression using deep neural networks with ReLU activation function
- Error bounds for approximations with deep ReLU networks
- A Sobolev-type upper bound for rates of approximation by linear combinations of Heaviside plane waves
- Integral combinations of Heavisides
- Universal approximation bounds for superpositions of a sigmoidal function
- New range theorems for the dual Radon transform
- A Course in Analysis
- Deep Neural Network Approximation Theory
- Optimal Approximation with Sparsely Connected Deep Neural Networks