\(\mathrm{SU}(1,1)\) equivariant neural networks and application to robust Toeplitz Hermitian positive definite matrix classification (Q2117902)
scientific article
Language | Label | Description | Also known as |
---|---|---|---|
English | \(\mathrm{SU}(1,1)\) equivariant neural networks and application to robust Toeplitz Hermitian positive definite matrix classification | scientific article |
Statements
\(\mathrm{SU}(1,1)\) equivariant neural networks and application to robust Toeplitz Hermitian positive definite matrix classification (English)
22 March 2022
Deep learning architectures are very efficient at representing complex data. When the problem at hand has symmetries, specialized architectures are used; for instance, convolutional neural networks respect translational symmetry. Group-equivariant neural networks for other symmetry groups have recently attracted considerable attention; they were, for example, instrumental in the success of AlphaFold, a system for 3D protein structure prediction. Group-convolutional neural networks (G-CNNs) have been investigated by several authors, and their robustness and data efficiency are very important in real-world applications. The authors propose an alternative method for building \(\mathrm{SU}(1,1)\)-equivariant neural networks on the Poincaré disk. In order to obtain good estimates of the group convolution integral, they define a Monte Carlo sampling procedure on \(\mathrm{SU}(1,1)\) using Helgason-Fourier (HF) transforms of functions defined on homogeneous spaces. As an application, they consider the classification of Toeplitz Hermitian positive definite matrices, after a suitable identification; this kind of classification can be used in Doppler signal processing tasks such as radar clutter analysis. For the entire collection see [Zbl 1482.94007].
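The "suitable identification" is not spelled out in the review. A plausible reading, suggested by the Burg-algorithm and hyperbolic-embedding keywords, is the classical one: an \(n \times n\) Toeplitz Hermitian positive definite matrix is determined by its autocorrelation sequence, and the Levinson-Durbin recursion sends that sequence to a mean power together with \(n-1\) reflection coefficients, each lying in the open unit (Poincaré) disk on which \(\mathrm{SU}(1,1)\) acts by Möbius transformations. The sketch below assumes this reading; the names `toeplitz_hpd_to_disk` and `su11_action` are illustrative and do not come from the paper.

```python
# Hedged sketch (not the paper's code): embed a Toeplitz Hermitian positive
# definite matrix into R_+ x D^{n-1}, D the Poincare disk, via the reflection
# coefficients of the Levinson-Durbin recursion, and illustrate how SU(1,1)
# acts on the disk.
import numpy as np


def toeplitz_hpd_to_disk(r):
    """Map the first column r[0..n-1] of a Toeplitz HPD matrix to
    (mean power, reflection coefficients); positive definiteness is
    equivalent to every reflection coefficient lying in the open unit disk."""
    r = np.asarray(r, dtype=complex)
    n = len(r)
    a = np.zeros(n, dtype=complex)   # prediction-filter coefficients
    e = r[0].real                    # prediction-error power, stays > 0 for HPD input
    mus = []
    for k in range(1, n):
        acc = r[k] + np.dot(a[1:k], r[1:k][::-1])
        mu = -acc / e                # k-th reflection coefficient
        mus.append(mu)
        a_new = a.copy()             # Levinson-Durbin filter update
        a_new[k] = mu
        for j in range(1, k):
            a_new[j] = a[j] + mu * np.conj(a[k - j])
        a = a_new
        e *= 1.0 - abs(mu) ** 2
    return r[0].real, np.array(mus)


def su11_action(g, z):
    """Moebius action of g = [[alpha, beta], [conj(beta), conj(alpha)]] in
    SU(1,1) (|alpha|^2 - |beta|^2 = 1) on a point z of the Poincare disk."""
    alpha, beta = g[0, 0], g[0, 1]
    return (alpha * z + beta) / (np.conj(beta) * z + np.conj(alpha))


if __name__ == "__main__":
    # 3x3 Toeplitz HPD matrix with first column (2, 0.5+0.5j, 0.1)
    power, mus = toeplitz_hpd_to_disk([2.0, 0.5 + 0.5j, 0.1])
    assert all(abs(m) < 1 for m in mus)                   # coefficients lie in the disk
    g = np.array([[np.sqrt(2), 1.0], [1.0, np.sqrt(2)]])  # an SU(1,1) element
    moved = [su11_action(g, m) for m in mus]              # group acts on disk coordinates
```

Under this assumed identification, equivariance means that transforming the disk coordinates by a group element, as in `moved` above, changes the network output in a prescribed, consistent way; the paper's Monte Carlo group convolution on \(\mathrm{SU}(1,1)\) is one way to build layers with that property.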
homogeneous spaces
Burg algorithm
radar clutter
equivariant neural networks
hyperbolic embedding