Theory of deep convolutional neural networks. III: Approximating radial functions
From MaRDI portal
Publication:6055154
DOI: 10.1016/j.neunet.2021.09.027 · zbMath: 1521.68193 · arXiv: 2107.00896 · OpenAlex: W3202429688 · MaRDI QID: Q6055154
No author found.
Publication date: 28 September 2023
Published in: Neural Networks
Full work available at URL: https://arxiv.org/abs/2107.00896
Keywords: rates of approximation · radial functions · convolutional neural networks · deep learning · generalization analysis
MSC classification: Artificial neural networks and deep learning (68T07) · Rate of convergence, degree of approximation (41A25)
Related Items (7)
Rates of approximation by ReLU shallow neural networks ⋮ Shared subspace-based radial basis function neural network for identifying ncRNAs subcellular localization ⋮ Error analysis of kernel regularized pairwise learning with a strongly convex loss ⋮ Deep learning theory of distribution regression with CNNs ⋮ Learning ability of interpolating deep convolutional neural networks ⋮ Approximation of functions from Korobov spaces by deep convolutional neural networks ⋮ Approximating functions with multi-features by deep convolutional neural networks
Cites Work
- Consistency analysis of an empirical minimum error entropy algorithm
- On best approximation by ridge functions
- Fundamentality of ridge functions
- Provable approximation properties for deep neural networks
- Distributed kernel-based gradient descent algorithms
- Approximation properties of a multilayered feedforward artificial neural network
- Theory of deep convolutional neural networks. II: Spherical analysis
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Theory of deep convolutional neural networks: downsampling
- Nonparametric regression using deep neural networks with ReLU activation function
- Error bounds for approximations with deep ReLU networks
- Universality of deep convolutional neural networks
- Approximation of Sobolev classes by polynomials and ridge functions
- Neural network with unbounded activation functions is universal approximator
- Deep vs. shallow networks: An approximation theory perspective
- Learning Theory
- Universal approximation bounds for superpositions of a sigmoidal function
- Deep distributed convolutional neural networks: Universality
- Approximation by Combinations of ReLU and Squared ReLU Ridge Functions With $\ell^1$ and $\ell^0$ Controls
- Optimal Approximation with Sparsely Connected Deep Neural Networks
- Theoretical issues in deep networks
- Equivalence of approximation by convolutional neural networks and fully-connected networks
- Deep neural networks for rotation-invariance approximation and learning
- Thresholded spectral algorithms for sparse approximations
- Nonparametric Regression Based on Hierarchical Interaction Models
- A Fast Learning Algorithm for Deep Belief Nets