Theory of deep convolutional neural networks. II: Spherical analysis
From MaRDI portal
Publication:2057723
Abstract: Deep learning based on deep neural networks of various structures and architectures has been powerful in many practical applications, but it still lacks sufficient theoretical justification. In this paper, we consider a family of deep convolutional neural networks applied to approximate functions on the unit sphere \(\mathbb{S}^{d-1}\) of \(\mathbb{R}^d\). Our analysis presents rates of uniform approximation when the approximated function lies in the Sobolev space \(W^r_\infty(\mathbb{S}^{d-1})\) with \(r>0\) or takes an additive ridge form. Our work verifies theoretically the modelling and approximation ability of deep convolutional neural networks followed by downsampling and one or two fully connected layers. The key idea of our spherical analysis is to use the inner product form of the reproducing kernels of the spaces of spherical harmonics and then to apply convolutional factorizations of filters to realize the generated linear features.
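The "convolutional factorization" mentioned in the abstract rests on a simple algebraic identity: composing convolutions with short filters realizes a single convolution with one long filter, because filter masks multiply like polynomial coefficients. The sketch below (illustrative only, not the paper's construction, which goes in the other direction by factorizing a given filter into small ones) checks this identity numerically with NumPy:

```python
import numpy as np

# Illustrative sketch of the factorization identity behind deep CNN
# approximation results: two convolutional layers with short filters
# w1, w2 (no activation) compute the same linear features as a single
# convolution with the long filter W = w1 * w2 (polynomial product).

rng = np.random.default_rng(0)

x = rng.standard_normal(16)   # input signal (e.g. coordinates of a data point)
w1 = rng.standard_normal(3)   # short filter in layer 1
w2 = rng.standard_normal(3)   # short filter in layer 2

# Stacked layers: convolve with w1, then with w2 ('full' padding).
layered = np.convolve(np.convolve(x, w1), w2)

# Single layer with the long filter W = w1 * w2, length 3 + 3 - 1 = 5.
W = np.convolve(w1, w2)
direct = np.convolve(x, W)

print(np.allclose(layered, direct))  # True: identical linear feature maps
```

Iterating this, a network of depth \(J\) with filters of fixed small length can realize convolutions with filters of length growing linearly in \(J\), which is what lets deep CNNs reproduce the linear features generated by the spherical-harmonic reproducing kernels.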
Recommendations
- Theory of deep convolutional neural networks. III: Approximating radial functions
- A Mathematical Theory of Deep Convolutional Neural Networks for Feature Extraction
- Convolutional neural networks analyzed via convolutional sparse coding
- Approximation Analysis of Convolutional Neural Networks
- Analysis of deep convolutional networks from group theory viewpoint
- Deep Neural Network Approximation Theory
- Convergence of deep convolutional neural networks
- Approximation spaces of deep neural networks
- A singular Riemannian geometry approach to deep neural networks. I: Theoretical foundations
- Convolutional Neural Networks in Phase Space and Inverse Problems
Cites work
- scientific article; zbMATH DE number 477682 (no title available)
- A Fast Learning Algorithm for Deep Belief Nets
- A lower bound for the worst-case cubature error on spheres of arbitrary dimension
- Approximation Theory and Harmonic Analysis on Spheres and Balls
- Approximation by Combinations of ReLU and Squared ReLU Ridge Functions with \(\ell^1\) and \(\ell^0\) Controls
- Approximation properties of a multilayered feedforward artificial neural network
- Consistency analysis of an empirical minimum error entropy algorithm
- Deep distributed convolutional neural networks: universality
- Deep learning
- Distributed kernel-based gradient descent algorithms
- Equivalence of approximation by convolutional neural networks and fully-connected networks
- Error bounds for approximations with deep ReLU networks
- Fully discrete needlet approximation on the sphere
- Learning rates for the risk of kernel-based quantile regression estimators in additive models
- Learning theory estimates via integral operators and their approximations
- Limitations of the approximation capabilities of neural networks with one hidden layer
- Neural network with unbounded activation functions is universal approximator
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Optimal approximation with sparsely connected deep neural networks
- Provable approximation properties for deep neural networks
- The best approximation of the classes of functions \(W_ p^{\alpha}(S^ n)\) by polynomials in spherical harmonics
- Theory of deep convolutional neural networks: downsampling
- Thresholded spectral algorithms for sparse approximations
- Universal approximation bounds for superpositions of a sigmoidal function
- Universality of deep convolutional neural networks
Cited in (22 publications)
- Universality of deep convolutional neural networks
- Error bounds for approximations using multichannel deep convolutional neural networks with downsampling
- Deep learning theory of distribution regression with CNNs
- Function space and critical points of linear convolutional networks
- Approximation of functions from Korobov spaces by deep convolutional neural networks
- Analysis of convolutional neural network image classifiers in a hierarchical max-pooling model with additional local pooling
- Rates of approximation by ReLU shallow neural networks
- Moduli of smoothness, \(K\)-functionals and Jackson-type inequalities associated with Kernel function approximation in learning theory
- On the density of translation networks defined on the unit ball
- Geometry of linear convolutional networks
- Forward and inverse approximation theory for linear temporal convolutional networks
- Theory of deep convolutional neural networks. III: Approximating radial functions
- Approximating functions with multi-features by deep convolutional neural networks
- Learning and approximating piecewise smooth functions by deep sigmoid neural networks
- Theory of deep convolutional neural networks: downsampling
- Analysis of deep convolutional networks from group theory viewpoint
- The CNN Paradigm: Shapes and Complexity
- Solving PDEs on spheres with physics-informed convolutional neural networks
- Approximation analysis of CNNs from a feature extraction view
- A deep network construction that adapts to intrinsic dimensionality beyond the domain
- Fully hyperbolic convolutional neural networks
- Deep distributed convolutional neural networks: universality