Function approximation with zonal function networks with activation functions analogous to the rectified linear unit functions
From MaRDI portal
Publication: 1734693
DOI: 10.1016/j.jco.2018.09.002
zbMath: 1409.41008
arXiv: 1709.08174
OpenAlex: W2850955080
Wikidata: Q129179581
Scholia: Q129179581
MaRDI QID: Q1734693
Publication date: 27 March 2019
Published in: Journal of Complexity
Full work available at URL: https://arxiv.org/abs/1709.08174
Related Items (4)
- An analysis of training and generalization errors in shallow and deep networks
- Rates of approximation by ReLU shallow neural networks
- Function approximation by deep networks
- Local approximation of operators
Cites Work
- Marcinkiewicz-Zygmund measures on manifolds
- Polynomial operators and local smoothness classes on the unit interval
- On the tractability of multivariate integration and approximation by neural networks
- Eignets for function approximation on manifolds
- Approximation by superposition of sigmoidal and radial basis functions
- On the representation of smooth functions on the sphere using finitely many bits
- Approximation properties of zonal function networks using scattered data on the sphere
- Polynomial operators and local approximation of solutions of pseudo-differential equations on the sphere
- Weighted quadrature formulas and approximation by zonal function networks on the sphere
- Deep vs. shallow networks: An approximation theory perspective
- Universal approximation bounds for superpositions of a sigmoidal function
- Dimension-independent bounds on the degree of approximation by neural networks
- Localized Linear Polynomial Operators and Quadrature Formulas on the Sphere
- Approximation by superpositions of a sigmoidal function