PDE-based group equivariant convolutional neural networks
From MaRDI portal
Publication: 6156050
DOI: 10.1007/S10851-022-01114-X
arXiv: 2001.09046
OpenAlex: W3001808426
Wikidata: Q114226025 (Scholia: Q114226025)
MaRDI QID: Q6156050
FDO: Q6156050
Authors: Bart M. N. Smets, Jim Portegies, E. J. Bekkers, Remco Duits
Publication date: 12 June 2023
Published in: Journal of Mathematical Imaging and Vision
Abstract: We present a PDE-based framework that generalizes group equivariant convolutional neural networks (G-CNNs). In this framework, a network layer is seen as a set of PDE solvers whose geometrically meaningful PDE coefficients become the layer's trainable weights. Formulating our PDEs on homogeneous spaces allows these networks to be designed with built-in symmetries, such as rotation, in addition to the standard translation equivariance of CNNs. Having all the desired symmetries included in the design obviates the need to include them by means of costly techniques such as data augmentation. We discuss our PDE-based G-CNNs (PDE-G-CNNs) in a general homogeneous space setting while also going into the specifics of our primary case of interest: roto-translation equivariance. We solve the PDE of interest by a combination of linear group convolutions and non-linear morphological group convolutions with analytic kernel approximations that we underpin with formal theorems. Our kernel approximations allow for fast GPU implementation of the PDE solvers; we release our implementation with this article in the form of the LieTorch extension to PyTorch, available at https://gitlab.com/bsmetsjr/lietorch . Just as for linear convolution, a morphological convolution is specified by a kernel, which we train in our PDE-G-CNNs. In PDE-G-CNNs we do not use non-linearities such as max/min-pooling and ReLUs, as they are already subsumed by morphological convolutions. We present a set of experiments to demonstrate the strength of the proposed PDE-G-CNNs in increasing the performance of deep-learning-based imaging applications with far fewer parameters than traditional CNNs.
Full work available at URL: https://arxiv.org/abs/2001.09046
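The morphological group convolutions mentioned in the abstract replace linear sum-product convolution with a max-plus ("tropical") operation, which is why separate ReLU/max-pooling non-linearities become unnecessary. A minimal 1D sketch of such a morphological dilation is given below; this toy version is purely illustrative (the function name and the flat structuring element are assumptions, not part of the paper, whose networks use trainable morphological kernels on the homogeneous space SE(2)):

```python
import numpy as np

def morphological_dilation_1d(f, k):
    """Max-plus convolution (dilation): (f ⊕ k)(x) = max_y [f(x - y) + k(y)].

    Illustrative sketch only; PDE-G-CNNs perform this on SE(2) with
    analytically approximated, trainable kernels.
    """
    n, m = len(f), len(k)
    pad = m // 2
    # -inf is the neutral element of max-plus algebra, so it is the right padding value
    fp = np.pad(f, pad, constant_values=-np.inf)
    out = np.empty(n)
    for x in range(n):
        window = fp[x:x + m][::-1]  # f(x - y) for y = -pad .. pad
        out[x] = np.max(window + k)
    return out

signal = np.array([0.0, 1.0, 0.0, 3.0, 0.0])
kernel = np.array([-1.0, 0.0, -1.0])  # concave structuring element
print(morphological_dilation_1d(signal, kernel))  # → [0. 1. 2. 3. 2.]
```

Note how the local maxima of the signal propagate outward, attenuated by the kernel: a max-plus convolution acts as a built-in pooling step, which is the sense in which the paper's morphological convolutions subsume max/min-pooling.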
Cites Work
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Coherent states, wavelets and their generalizations
- Introduction to Smooth Manifolds
- Title not available
- A cortical based model of perceptual completion in the roto-translation space
- Nonsmooth analysis and Hamilton--Jacobi equations on Riemannian manifolds
- A cortical-inspired geometry for contour perception and motion integration
- Title not available
- Abstract harmonic analysis of continuous wavelet transforms
- Geometric partial differential equations and image analysis
- Geodesic methods in computer vision and graphics
- Evolution equations on Gabor transforms and their applications
- Numerical approaches for linear left-invariant diffusions on \(SE(2)\), their comparison to exact solutions, and their applications in retinal imaging
- Left-invariant parabolic evolutions on \(SE(2)\) and contour enhancement via invertible orientation scores. I: Linear left-invariant diffusion equations on \(SE(2)\)
- Morphological and linear scale spaces for fiber enhancement in DW-MRI
- A PDE approach to data-driven sub-Riemannian geodesics in \(\mathrm{SE}(2)\)
- Collaborative total variation: a general framework for vectorial TV models
- Sub-Riemannian mean curvature flow for image processing
- Hypoelliptic diffusion and human vision: a semidiscrete new twist
- Left-invariant parabolic evolutions on \(SE(2)\) and contour enhancement via invertible orientation scores. II: Nonlinear left-invariant diffusions on invertible orientation scores
- Title not available
- First order algorithms in variational image processing
- Title not available
- Integral representations of resolvents and semigroups
- Functional inequalities and Hamilton-Jacobi equations in geodesic spaces
- Weighted subcoercive operators on Lie groups
- Title not available
- Estimations of the heat kernel on homogeneous spaces
- A version of the Hopf-Lax formula in the Heisenberg group
- Metric Hopf-Lax formula with semicontinuous data
- Design and processing of invertible orientation scores of 3D images
- Deep neural networks motivated by partial differential equations
- Equivariant deep learning via morphological and linear scale space PDEs on the space of positions and orientations
- Translating numerical concepts for PDEs into neural architectures
- A proposal on machine learning via dynamical systems
- Computer Vision - ECCV 2004
- An operational calculus for the Euclidean motion group with applications in robotics and polymer science
- Cyclic schemes for PDE-based image analysis
- A geometric model of multi-scale orientation preference maps via Gabor functions
- New approximation of a scale space kernel on SE(3) and applications in neuroimaging
- Total variation and mean curvature PDEs on the homogeneous space of positions and orientations
- Geometrical optical illusion via sub-Riemannian geodesics in the roto-translation group
- Total roto-translational variation
- A bio-inspired geometric model for sound reconstruction
- An introduction to the geometry of homogeneous spaces
Cited In (6)
- Deep limits of residual neural networks
- Local binary patterns of segments of a binary object for shape analysis
- Can generalised divergences help for invariant neural networks?
- Functional properties of PDE-based group equivariant convolutional neural networks
- Analysis of (sub-)Riemannian PDE-G-CNNs
- Dynamical Systems–Based Neural Networks
This page was built for publication: PDE-based group equivariant convolutional neural networks