Approximation properties of deep ReLU CNNs
Publication: 2157922
DOI: 10.1007/s40687-022-00336-0
OpenAlex: W3198891595
Wikidata: Q114218938
Scholia: Q114218938
MaRDI QID: Q2157922
Juncai He, Lin Li, Jin-Chao Xu
Publication date: 22 July 2022
Published in: Research in the Mathematical Sciences
Full work available at URL: https://arxiv.org/abs/2109.00190
MSC classification:
- Artificial neural networks and deep learning (68T07)
- Approximation by other special function classes (41A30)
- Numerical approximation of high-dimensional functions; sparse grids (65D40)
Cites Work
- Multilayer feedforward networks are universal approximators
- Approximation rates for neural networks with general activation functions
- High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions
- Nonlinear approximation via compositions
- Error bounds for approximations with deep ReLU networks
- Universality of deep convolutional neural networks
- MgNet: a unified framework of multigrid and convolutional neural network
- ReLU Deep Neural Networks and Linear Finite Elements
- Ten Lectures on Wavelets
- Universal approximation bounds for superpositions of a sigmoidal function
- Deep distributed convolutional neural networks: Universality
- Approximation by Combinations of ReLU and Squared ReLU Ridge Functions With \(\ell^1\) and \(\ell^0\) Controls
- Universal Consistency of Deep Convolutional Neural Networks
- Deep ReLU networks and high-order finite element methods
- Error bounds for approximations with deep ReLU neural networks in Ws,p norms
- Finite Neuron Method and Convergence Analysis
- Equivalence of approximation by convolutional neural networks and fully-connected networks
- Breaking the Curse of Dimensionality with Convex Neural Networks
- Approximation by superpositions of a sigmoidal function