Universal approximations of invariant maps by neural networks
From MaRDI portal
Publication:2117338
Mathematics Subject Classification:
- General nonlinear regression (62J02)
- Learning and adaptive systems in artificial intelligence (68T05)
- Applications of group representations to physics and other areas of science (20C35)
- Neural nets and related approaches to inference from stochastic processes (62M45)
- Multidimensional problems (41A63)
- Actions of groups on commutative rings; invariant theory (13A50)
- Approximation by other special function classes (41A30)
Abstract: We describe generalizations of the universal approximation theorem for neural networks to maps invariant or equivariant with respect to linear representations of groups. Our goal is to establish network-like computational models that are both invariant/equivariant and provably complete in the sense of their ability to approximate any continuous invariant/equivariant map. Our contribution is three-fold. First, in the general case of compact groups we propose a construction of a complete invariant/equivariant network using an intermediate polynomial layer. We invoke classical theorems of Hilbert and Weyl to justify and simplify this construction; in particular, we describe an explicit complete ansatz for approximation of permutation-invariant maps. Second, we consider groups of translations and prove several versions of the universal approximation theorem for convolutional networks in the limit of continuous signals on Euclidean spaces. Finally, we consider 2D signal transformations equivariant with respect to the group SE(2) of rigid Euclidean motions. In this case we introduce the "charge-conserving convnet" -- a convnet-like computational model based on the decomposition of the feature space into isotypic representations of SO(2). We prove this model to be a universal approximator for continuous SE(2)-equivariant signal transformations.
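The explicit complete ansatz for permutation-invariant maps mentioned in the abstract can be illustrated by the sum-pooling form f(x) = ρ(Σᵢ φ(xᵢ)): pooling a per-element feature map over all elements makes the output independent of their order. The following is a minimal sketch of this idea, not the paper's exact construction; the weight shapes and random values are illustrative assumptions.

```python
# Sketch of a permutation-invariant network in the sum-pooling ansatz
# f(x) = rho(sum_i phi(x_i)). The specific weights are random and purely
# illustrative; any choice yields a permutation-invariant map.
import numpy as np

rng = np.random.default_rng(0)
W_phi = rng.standard_normal((4, 1))   # per-element feature map phi: R -> R^4
b_phi = rng.standard_normal(4)
W_rho = rng.standard_normal((1, 4))   # outer readout map rho: R^4 -> R
b_rho = rng.standard_normal(1)

def phi(xi):
    # shared nonlinearity applied to each element independently
    return np.tanh(W_phi @ xi + b_phi)

def f(x):
    # summation over elements is order-independent, hence f is
    # invariant under any permutation of the inputs
    pooled = sum(phi(np.atleast_1d(xi)) for xi in x)
    return (W_rho @ pooled + b_rho).item()

x = np.array([0.3, -1.2, 0.7])
print(abs(f(x) - f(x[[2, 0, 1]])) < 1e-10)  # invariance check
```

Because the sum collapses any ordering of the inputs to the same pooled vector, the composed map is exactly permutation-invariant by construction, regardless of the learned parameters.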
Recommendations
- Probabilistic symmetries and invariant neural networks
- What is... an Equivariant Neural Network?
- A unifying framework for invariant pattern recognition
- Group invariance, stability to deformations, and complexity of deep convolutional representations
- Universality of deep convolutional neural networks
Cites work
- scientific article; zbMATH DE number 18587
- scientific article; zbMATH DE number 2003146
- scientific article; zbMATH DE number 1461253
- scientific article; zbMATH DE number 835752
- scientific article; zbMATH DE number 1405266
- Approximation by superposition of sigmoidal and radial basis functions
- Approximation by superpositions of a sigmoidal function
- Deep learning
- Endomorphisms and automorphisms of the shift dynamical system
- Group invariant scattering
- Multilayer feedforward networks are universal approximators
- On invariance and selectivity in representation learning
- Rotation covariant image processing for biomedical applications
- TDI-subspaces of \(C(\mathbb{R}^d)\) and some density problems from neural networks
- Zeros of equivariant vector fields: Algorithms for an invariant approach
Cited in (21 documents)
- Equivariant deep learning via morphological and linear scale space PDEs on the space of positions and orientations
- Full error analysis for the training of deep neural networks
- On the finite representation of linear group equivariant operators via permutant measures
- Ehresmann connections and feedforward neural networks
- \(\mathrm{SU}(1,1)\) equivariant neural networks and application to robust Toeplitz Hermitian positive definite matrix classification
- Iterative SE(3)-transformers
- A Proof that Artificial Neural Networks Overcome the Curse of Dimensionality in the Numerical Approximation of Black–Scholes Partial Differential Equations
- scientific article; zbMATH DE number 1294024
- A unified Fourier slice method to derive ridgelet transform for a variety of depth-2 neural networks
- scientific article; zbMATH DE number 5163730
- Probabilistic symmetries and invariant neural networks
- Approximation capabilities of measure-preserving neural networks
- The universal approximation theorem for complex-valued neural networks
- Neural network approximation of continuous functions in high dimensions with applications to inverse problems
- Statistical theory for image classification using deep convolutional neural network with cross-entropy loss under the hierarchical max-pooling model
- What is... an Equivariant Neural Network?
- Homogeneous vector bundles and \(G\)-equivariant convolutional neural networks
- Scale equivariant neural networks with morphological scale-spaces
- TransNet: shift invariant transformer network for side channel analysis
- Universal approximation of symmetric and anti-symmetric functions
- Piecewise integrable neural network: an interpretable chaos identification framework