Continuous limits of residual neural networks in case of large input data
From MaRDI portal
Publication:6098879
DOI: 10.2478/caim-2022-0008
zbMath: 1512.35576
arXiv: 2112.14150
MaRDI QID: Q6098879
Giuseppe Visconti, Anna Thünen, Torsten Trimborn, Michael Herty
Publication date: 19 June 2023
Published in: Communications in Applied and Industrial Mathematics
Full work available at URL: https://arxiv.org/abs/2112.14150
Neural networks for/in biological studies, artificial life and related topics (92B20)
PDEs in connection with classical thermodynamics and heat transfer (35Q79)
Mean field games and control (49N80)
Cites Work
- Simple bounds for the convergence of empirical and occupation measures in 1-Wasserstein distance
- On the rate of convergence in Wasserstein distance of the empirical measure
- Deep learning observables in computational fluid dynamics
- Controlling oscillations in high-order discontinuous Galerkin schemes using artificial viscosity tuned by neural networks
- Constraint-aware neural networks for Riemann problems
- Efficient implementation of weighted ENO schemes
- Mean-field and kinetic descriptions of neural differential equations
- A strategic learning algorithm for state-based games
- Detecting troubled-cells on two-dimensional unstructured grids using a neural network
- Mean-field limit of a spatially-extended Fitzhugh-Nagumo neural network
- Mean field analysis of neural networks: a central limit theorem
- A machine learning framework for data driven acceleration of computations of differential equations
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- An artificial neural network as a troubled-cell indicator
- Multiscale modeling of pedestrian dynamics
- Numerical Optimization
- CWENO: Uniformly accurate reconstructions for balance laws
- A mean field view of the landscape of two-layer neural networks
- Ensemble Kalman inversion: a derivative-free technique for machine learning tasks
- Replica-Mean-Field Limits for Intensity-Based Neural Networks
- Mean Field Analysis of Neural Networks: A Law of Large Numbers
- Learning in Modal Space: Solving Time-Dependent Stochastic PDEs Using Physics-Informed Neural Networks
- Value Iteration Architecture Based Deep Learning for Intelligent Routing Exploiting Heterogeneous Computing Platforms
- Optimal Transport