Efficient and stable SAV-based methods for gradient flows arising from deep learning
From MaRDI portal
Publication: 6497260
DOI: 10.1016/j.jcp.2024.112911 (MaRDI QID: Q6497260)
Jie Shen, Zhiping Mao, Ziqi Ma
Publication date: 6 May 2024
Published in: Journal of Computational Physics
MSC classification:
- Artificial intelligence (68Txx)
- Numerical methods for partial differential equations, initial value and time-dependent initial-boundary value problems (65Mxx)
- Parabolic equations and parabolic systems (35Kxx)
Cites Work
- Machine learning from a continuous viewpoint. I
- The scalar auxiliary variable (SAV) approach for gradient flows
- DGM: a deep learning algorithm for solving partial differential equations
- The Barron space and the flow-induced function spaces for neural network models
- Improving the accuracy and consistency of the scalar auxiliary variable (SAV) method with relaxation
- Mean field analysis of neural networks: a central limit theorem
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- A mean-field optimal control formulation of deep learning
- A proposal on machine learning via dynamical systems
- Universal approximation bounds for superpositions of a sigmoidal function
- Stable architectures for deep neural networks
- Deep backward schemes for high-dimensional nonlinear PDEs
- A mean field view of the landscape of two-layer neural networks
- Solving high-dimensional partial differential equations using deep learning
- DeepXDE: A Deep Learning Library for Solving Differential Equations
- Learning representations by back-propagating errors
- A Stochastic Approximation Method
- Approximation by superpositions of a sigmoidal function