The construction and approximation of ReLU neural network operators
From MaRDI portal
Publication:2086452
DOI: 10.1155/2022/1713912
OpenAlex: W4297453899
MaRDI QID: Q2086452
Hengjie Chen, Dansheng Yu, Zhong Li
Publication date: 25 October 2022
Published in: Journal of Function Spaces
Full work available at URL: https://doi.org/10.1155/2022/1713912
Cites Work
- Univariate hyperbolic tangent neural network approximation
- The approximation operators with sigmoidal functions
- Approximation by Ridge functions and neural networks with one hidden layer
- Rate of convergence of some neural network operators to the unit-univariate case
- Neural network operators: constructive interpolation of multivariate functions
- Multilayer feedforward networks are universal approximators
- Provable approximation properties for deep neural networks
- An approximation by neural networks with a fixed weight
- Degree of approximation by neural and translation networks with a single hidden layer
- Rates of approximation by neural network interpolation operators
- Nonlinear approximation and (deep) ReLU networks
- Voronovskaja type theorems and high-order convergence neural network operators with sigmoidal functions
- On the approximation by single hidden layer feedforward neural networks with fixed weights
- Interpolation by neural network operators activated by ramp functions
- Error bounds for approximations with deep ReLU networks
- Universality of deep convolutional neural networks
- Universal approximation bounds for superpositions of a sigmoidal function
- A note on the expressive power of deep rectified linear unit networks in high‐dimensional spaces
- Approximation by superpositions of a sigmoidal function