Equivalence of approximation by convolutional neural networks and fully-connected networks
DOI: 10.1090/proc/14789 · OpenAlex: W2982376398 · Wikidata: Q126860813 · Scholia: Q126860813 · MaRDI QID: Q5218202
Philipp Petersen, Felix Voigtlaender
Publication date: 2 March 2020
Published in: Proceedings of the American Mathematical Society
Full work available at URL: https://arxiv.org/abs/1809.00973
MSC classification:
- Convolution as an integral transform (44A35)
- Rate of convergence, degree of approximation (41A25)
- Approximation by operators (in particular, by integral operators) (41A35)
- Algorithms for approximation of functions (65D15)
- Approximation by arbitrary nonlinear expressions; widths and entropy (41A46)
Cites Work
- Approximation by superposition of sigmoidal and radial basis functions
- Lower bounds for approximation by MLP neural networks
- Multilayer feedforward networks are universal approximators
- Provable approximation properties for deep neural networks
- Approximation properties of a multilayered feedforward artificial neural network
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Error bounds for approximations with deep ReLU networks
- Universal approximation bounds for superpositions of a sigmoidal function
- Deep distributed convolutional neural networks: Universality
- Optimal Approximation with Sparsely Connected Deep Neural Networks
- Approximation by superpositions of a sigmoidal function