Adaptive activation functions accelerate convergence in deep and physics-informed neural networks
DOI: 10.1016/j.jcp.2019.109136 · zbMath: 1453.68165 · arXiv: 1906.01170 · OpenAlex: W2948551291 · Wikidata: Q126761242 · Scholia: Q126761242 · MaRDI QID: Q2223034
Kenji Kawaguchi, Ameya D. Jagtap, George Em Karniadakis
Publication date: 28 January 2021
Published in: Journal of Computational Physics
Full work available at URL: https://arxiv.org/abs/1906.01170
inverse problems; partial differential equations; machine learning; physics-informed neural networks; bad minima; deep learning benchmarks
Artificial neural networks and deep learning (68T07); Algorithms for approximation of functions (65D15); Numerical methods for inverse problems for initial value and initial-boundary value problems involving PDEs (65M32)
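The adaptive activation approach named in the title multiplies the activation's argument by a trainable parameter a together with a fixed scale factor n, i.e. σ(n·a·x), so the effective slope of each activation can adapt during training. A minimal NumPy sketch of the forward pass and the gradient with respect to a (the function names and the choice n = 10 are illustrative, not from the record):

```python
import numpy as np

def adaptive_tanh(x, a, n=10.0):
    """Adaptive activation sigma(n * a * x) with sigma = tanh.

    a : trainable slope parameter (updated by gradient descent)
    n : fixed scale factor, n >= 1
    """
    return np.tanh(n * a * x)

def grad_wrt_a(x, a, n=10.0):
    """Gradient of adaptive_tanh with respect to a:
    d/da tanh(n*a*x) = n * x * sech^2(n*a*x)."""
    return n * x / np.cosh(n * a * x) ** 2
```

With a = 1/n the unit recovers the standard tanh, which is a natural initialization; the extra parameter adds negligible cost while changing the loss landscape seen by the optimizer.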
Related Items (93)
Cites Work
- Unnamed Item
- New travelling wave solutions to the Boussinesq and the Klein-Gordon equations
- Spectral and finite difference solutions of the Burgers equations
- Inferring solutions of differential equations using noisy multi-fidelity data
- Machine learning of linear differential equations using Gaussian processes
- Hidden physics models: machine learning of nonlinear partial differential equations
- Data-driven discovery of PDEs in complex datasets
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- A paradigm for data-driven predictive modeling using field inversion and machine learning
- Numerical Gaussian Processes for Time-Dependent and Nonlinear Partial Differential Equations
- Deep learning of vortex-induced vibrations
- Bayesian Numerical Homogenization