Nonlinear approximation and (deep) ReLU networks
DOI: 10.1007/s00365-021-09548-z · zbMath: 1501.41003 · arXiv: 1905.02199 · OpenAlex: W3160447895 · MaRDI QID: Q2117331
Simon Foucart, Ronald A. DeVore, Ingrid Daubechies, Boris Hanin, Guergana Petrova
Publication date: 21 March 2022
Published in: Constructive Approximation
Full work available at URL: https://arxiv.org/abs/1905.02199
MSC classification
- Artificial neural networks and deep learning (68T07)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
- Neural nets applied to problems in time-dependent statistical mechanics (82C32)
- Rate of convergence, degree of approximation (41A25)
- Approximation by other special function classes (41A30)
- Approximation by arbitrary nonlinear expressions; widths and entropy (41A46)
Cites Work
- The Takagi function: a survey
- Weierstrass' function and chaos
- Multilayer feedforward networks are universal approximators
- Provable approximation properties for deep neural networks
- Optimal nonlinear approximation
- Wavelet compression and nonlinear \(n\)-widths
- Exponential convergence of the deep neural network approximation for analytic functions
- Error bounds for approximations with deep ReLU networks
- Deep vs. shallow networks: An approximation theory perspective
- Neural Networks for Localized Approximation
- Deep learning in high dimension: Neural network expression rates for generalized polynomial chaos expansions in UQ
- Optimal Approximation with Sparsely Connected Deep Neural Networks
- Deep Network Approximation for Smooth Functions
- Deep Network Approximation Characterized by Number of Neurons
- Approximation by superpositions of a sigmoidal function