Dynamical Systems–Based Neural Networks
From MaRDI portal
Publication:6181900
dynamical systems; neural networks; universal approximation theorem; Lipschitz networks; structure-preserving deep learning
Artificial neural networks and deep learning (68T07); Multistep, Runge-Kutta and extrapolation methods for ordinary differential equations (65L06); Discretization methods and integrators (symplectic, variational, geometric, etc.) for dynamical systems (37M15); Numerical methods for initial value problems involving ordinary differential equations (65L05)
Abstract: Neural networks have gained much interest because of their effectiveness in many applications. However, their mathematical properties are generally not well understood. If there is some underlying geometric structure inherent to the data or to the function to approximate, it is often desirable to take this into account in the design of the neural network. In this work, we start with a non-autonomous ODE and build neural networks using a suitable, structure-preserving, numerical time-discretisation. The structure of the neural network is then inferred from the properties of the ODE vector field. Besides injecting more structure into the network architectures, this modelling procedure allows a better theoretical understanding of their behaviour. We present two universal approximation results and demonstrate how to impose some particular properties on the neural networks. A particular focus is on 1-Lipschitz architectures including layers that are not 1-Lipschitz. These networks are expressive and robust against adversarial attacks, as shown for the CIFAR-10 dataset.
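The modelling procedure described in the abstract can be illustrated with a minimal sketch (not the authors' code): a residual network read as the explicit-Euler discretisation of a non-autonomous ODE x'(t) = f(t, x(t)), here with the hypothetical vector field f(t, x) = tanh(W(t)x + b(t)), so that layer l performs the time step x_{l+1} = x_l + h·tanh(W_l x_l + b_l). The weights, step size, and network width below are illustrative assumptions.

```python
import numpy as np

# Hedged illustration: a ResNet-style forward pass as explicit Euler
# integration of the non-autonomous ODE x'(t) = tanh(W(t) x + b(t)).
# Layer l corresponds to time t_l = l * h, with time-dependent
# parameters (W_l, b_l); all concrete values here are made up.

rng = np.random.default_rng(0)

def make_layers(n_layers, dim, scale=0.1):
    """Random per-layer weights (W_l, b_l), standing in for trained ones."""
    return [(scale * rng.standard_normal((dim, dim)),
             scale * rng.standard_normal(dim)) for _ in range(n_layers)]

def resnet_forward(x, layers, h=0.1):
    """One forward pass = explicit Euler steps x <- x + h * f(t_l, x)."""
    for W, b in layers:
        x = x + h * np.tanh(W @ x + b)
    return x

x0 = np.array([1.0, -0.5, 0.25])
layers = make_layers(n_layers=20, dim=3)
x_out = resnet_forward(x0, layers)
```

Since tanh is 1-Lipschitz, each Euler step above has Lipschitz constant at most 1 + h·‖W_l‖, which hints at how step size and weight norms can be constrained to obtain the 1-Lipschitz architectures the paper focuses on; other discretisations or vector fields yield other structural guarantees.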
Recommendations
- Deep learning via dynamical systems: an approximation perspective
- Designing stable neural networks using convex analysis and ODEs
- Stable architectures for deep neural networks
- Deep Hamiltonian neural networks based on symplectic integrators
- SympNets: intrinsic structure-preserving symplectic networks for identifying Hamiltonian systems
Cites work
- scientific article; zbMATH DE number 1405266 (no title available)
- A high-order conservative Patankar-type discretisation for stiff systems of production–destruction equations
- A proposal on machine learning via dynamical systems
- A stability result for switched systems with multiple equilibria
- Basic problems in stability and design of switched systems
- Control on the manifolds of mappings with a view to the deep learning
- Convolutional proximal neural networks and plug-and-play algorithms
- Deep Hamiltonian neural networks based on symplectic integrators
- Deep learning via dynamical systems: an approximation perspective
- Deep limits of residual neural networks
- Deep neural networks motivated by partial differential equations
- Geometric Numerical Integration
- Hamiltonian Deep Neural Networks Guaranteeing Nonvanishing Gradients by Design
- Learning Hamiltonians of constrained mechanical systems
- Locally-symplectic neural networks for learning volume-preserving dynamics
- Neural ODE Control for Classification, Approximation, and Transport
- Non-local decomposition of vector fields
- Order theory for discrete gradient methods
- PDE-based group equivariant convolutional neural networks
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- Positivity-preserving methods for ordinary differential equations
- Reducibility and contractivity of Runge-Kutta methods revisited
- Simulating Hamiltonian Dynamics
- Solving inverse problems using data-driven models
- Splitting methods
- Structure-preserving deep learning
- Switching in systems and control
- SympNets: intrinsic structure-preserving symplectic networks for identifying Hamiltonian systems
- The modern mathematics of deep learning
- \(L^p\) approximation of maps by diffeomorphisms
Cited in (12)
- Constrained dynamics, stochastic numerical methods and the modeling of complex systems. Abstracts from the workshop held May 26–31, 2024
- Inferring the dynamics of oscillatory systems using recurrent neural networks
- Designing stable neural networks using convex analysis and ODEs
- A recurrent neural network for modelling dynamical systems
- ABBA neural networks: coping with positivity, expressivity, and robustness
- Dynamics and architecture for neural computation
- \textit{dynoNet}: a neural network architecture for learning dynamical systems
- Modeling of complex dynamic systems using differential neural networks with the incorporation of a priori knowledge
- Phase portrait approximation using dynamic neural networks
- scientific article; zbMATH DE number 2221002 (no title available)
- Structure-preserving recurrent neural networks for a class of Birkhoffian systems
- Self-organizing feedforward neural network and modeling of nonlinear dynamical system
This page was built for publication: Dynamical Systems–Based Neural Networks