Learning stiff chemical kinetics using extended deep neural operators
From MaRDI portal
Publication: 6185234
DOI: 10.1016/j.cma.2023.116674 · arXiv: 2302.12645 · MaRDI QID: Q6185234
Bryan T. Susi, Somdatta Goswami, George Em Karniadakis, Ameya D. Jagtap, Hessam Babaee
Publication date: 29 January 2024
Published in: Computer Methods in Applied Mechanics and Engineering
Full work available at URL: https://arxiv.org/abs/2302.12645
Cites Work
- Discovering governing equations from data by sparse identification of nonlinear dynamical systems
- An improved algorithm for in situ adaptive tabulation
- A comprehensive and fair comparison of two neural operators (with practical extensions) based on FAIR data
- Adaptive activation functions accelerate convergence in deep and physics-informed neural networks
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- SVD perspectives for augmenting DeepONet flexibility and interpretability
- On the influence of over-parameterization in manifold based surrogates and deep neural operators
- Computationally efficient implementation of combustion chemistry using in situ adaptive tabulation
- Quasi steady state and partial equilibrium approximations: their relation and their validity
- Locally adaptive activation functions with slope recovery for deep and physics-informed neural networks
- A quasi-steady-state solver for the stiff ordinary differential equations of reaction kinetics