SympNets: intrinsic structure-preserving symplectic networks for identifying Hamiltonian systems
From MaRDI portal
Publication: Q2057752
Abstract: We propose new symplectic networks (SympNets) for identifying Hamiltonian systems from data, based on a composition of linear, activation and gradient modules. In particular, we define two classes of SympNets: the LA-SympNets, composed of linear and activation modules, and the G-SympNets, composed of gradient modules. Correspondingly, we prove two new universal approximation theorems demonstrating that SympNets can approximate arbitrary symplectic maps given appropriate activation functions. We then perform several experiments, including the pendulum, double pendulum and three-body problems, to investigate the expressivity and generalization ability of SympNets. The simulation results show that even SympNets of very small size can generalize well, and are able to handle both separable and non-separable Hamiltonian systems with data sampled at short or long time steps. In all the test cases, SympNets outperform the baseline models and are much faster in training and prediction. We also develop an extended version of SympNets to learn the dynamics from irregularly sampled data. This extended version of SympNets can be thought of as a universal model representing the solution to an arbitrary Hamiltonian system.
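The gradient modules mentioned in the abstract can be illustrated in a short NumPy sketch. The map below is a minimal "up" gradient module of the form p' = p + Kᵀ(a ⊙ σ(Kq + b)), q' = q, which is an exact symplectic shear for any parameters, together with a finite-difference check that its Jacobian M satisfies MᵀJM = J. All names (K, a, b, width) and the check routine are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Sketch of one "up" gradient module:
#   p' = p + K^T (a * sigma(K q + b)),   q' = q.
# Its Jacobian is unit upper-triangular with a symmetric off-diagonal
# block (the Hessian of a scalar potential of q), hence symplectic.
def gradient_module_up(p, q, K, a, b, sigma=np.tanh):
    return p + K.T @ (a * sigma(K @ q + b)), q

def jacobian(f, x, eps=1e-5):
    """Central-difference Jacobian of f at x."""
    n = x.size
    M = np.zeros((n, n))
    for i in range(n):
        dx = np.zeros(n)
        dx[i] = eps
        M[:, i] = (f(x + dx) - f(x - dx)) / (2 * eps)
    return M

rng = np.random.default_rng(0)
d, width = 2, 5                      # half phase-space dimension, module width
K = rng.standard_normal((width, d))
a = rng.standard_normal(width)
b = rng.standard_normal(width)

def flat_map(x):                     # (p, q) stacked into one vector
    p2, q2 = gradient_module_up(x[:d], x[d:], K, a, b)
    return np.concatenate([p2, q2])

# Symplecticity check: the Jacobian M must satisfy M^T J M = J,
# where J is the canonical symplectic matrix.
x0 = rng.standard_normal(2 * d)
M = jacobian(flat_map, x0)
J = np.block([[np.zeros((d, d)), np.eye(d)],
              [-np.eye(d), np.zeros((d, d))]])
residual = np.max(np.abs(M.T @ J @ M - J))
print(residual)                      # small, up to finite-difference error
```

Because each module is exactly symplectic, any composition of such modules is symplectic as well, which is the structural guarantee the paper's universal approximation theorems build on.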
Recommendations
- Deep Hamiltonian neural networks based on symplectic integrators
- Learning Hamiltonians of constrained mechanical systems
- Artificial Intelligence and Soft Computing - ICAISC 2004
- Pseudo-Hamiltonian neural networks with state-dependent external forces
- Symplectic Gaussian process regression of maps in Hamiltonian systems
Cites work
- scientific article; zbMATH DE number 4078717
- A Riemannian steepest descent approach over the inhomogeneous symplectic group: application to the averaging of linear optical systems
- A Riemannian-steepest-descent approach for optimization on the real symplectic group
- Approximation by superpositions of a sigmoidal function
- Computing Semiclassical Quantum Dynamics with Hagedorn Wavepackets
- Exact low-order polynomial expressions to compute the Kolmogoroff–Nagumo mean in the affine symplectic group of optical transference matrices
- From quantum to classical molecular dynamics: Reduced models and numerical analysis.
- Geometric Numerical Integration
- Lie-group-type neural system learning by manifold retractions
- Multilayer feedforward networks are universal approximators
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- Polynomial approximations of symplectic dynamics and richness of chaos in non-hyperbolic area-preserving maps
- Symplectic analytically integrable decomposition algorithms: classification, derivation, and application to molecular dynamics, quantum and celestial mechanics simulations
- Symplectic geometry and quantum mechanics
Cited in (47)
- Symplectic integration of learned Hamiltonian systems
- A conservative hybrid deep learning method for Maxwell-Ampère-Nernst-Planck equations
- Structure-preserving recurrent neural networks for a class of Birkhoffian systems
- Dynamical Systems–Based Neural Networks
- Efficient Bayesian inference with latent Hamiltonian neural networks in no-U-turn sampling
- A minimalistic approach to physics-informed machine learning using neighbour lists as physics-optimized convolutions for inverse problems involving particle systems
- Structure preservation for the deep neural network multigrid solver
- Lax-Oleinik-type formulas and efficient algorithms for certain high-dimensional optimal control problems
- Render unto numerics: orthogonal polynomial neural operator for PDEs with nonperiodic boundary conditions
- Direct Poisson neural networks: learning non-symplectic mechanical systems
- Hamiltonian operator inference: physics-preserving learning of reduced-order models for canonical Hamiltonian systems
- Symplectic neural networks in Taylor series form for Hamiltonian systems
- A Recursively Recurrent Neural Network (R2N2) Architecture for Learning Iterative Algorithms
- Bayesian identification of nonseparable Hamiltonians with multiplicative noise using deep learning and reduced-order modeling
- Structure-preserving neural networks
- A neural network multigrid solver for the Navier-Stokes equations
- Locally-symplectic neural networks for learning volume-preserving dynamics
- Pseudo-Hamiltonian neural networks with state-dependent external forces
- Approximation capabilities of measure-preserving neural networks
- NySALT: Nyström-type inference-based schemes adaptive to large time-stepping
- Neural network architectures using min-plus algebra for solving certain high-dimensional optimal control problems and Hamilton-Jacobi PDEs
- Unit triangular factorization of the matrix symplectic group
- Optimal unit triangular factorization of symplectic matrices
- VPNets: volume-preserving neural networks for learning source-free dynamics
- Optimal Dirichlet boundary control by Fourier neural operators applied to nonlinear optics
- A hybrid approach for solving the gravitational \(N\)-body problem with artificial neural networks
- Implementation and (inverse modified) error analysis for implicitly templated ODE-nets
- Symplectic Gaussian process regression of maps in Hamiltonian systems
- Convolution hierarchical deep-learning neural networks (C-HiDeNN): finite elements, isogeometric analysis, tensor decomposition, and beyond
- Symmetry preservation in Hamiltonian systems: simulation and learning
- Deep learning method for finding eigenpairs in Sturm-Liouville eigenvalue problems
- Artificial Intelligence and Soft Computing - ICAISC 2004
- SympOCnet: Solving Optimal Control Problems with Applications to High-Dimensional Multiagent Path Planning Problems
- A structure-preserving neural differential operator with embedded Hamiltonian constraints for modeling structural dynamics
- A generalized framework of neural networks for Hamiltonian systems
- Deep Hamiltonian neural networks based on symplectic integrators
- Learning Hamiltonians of constrained mechanical systems
- Lie-Poisson neural networks (LPNets): data-based computing of Hamiltonian systems with symmetries
- Symplectic Model Reduction of Hamiltonian Systems on Nonlinear Manifolds and Approximation with Weakly Symplectic Autoencoder
- Port-metriplectic neural networks: thermodynamics-informed machine learning of complex physical systems
- Pseudo-Hamiltonian neural networks for learning partial differential equations
- Pseudo-Hamiltonian system identification
- Learning of discrete models of variational PDEs from data
- Variational learning of Euler-Lagrange dynamics from data
- Learning Hamiltonian systems with mono-implicit Runge-Kutta methods
- Neural networks enforcing physical symmetries in nonlinear dynamical lattices: the case example of the Ablowitz-Ladik model
- Symplectic learning for Hamiltonian neural networks