Reliable extrapolation of deep neural operators informed by physics or sparse observations
Publication:6097626
DOI: 10.1016/j.cma.2023.116064
arXiv: 2212.06347
OpenAlex: W4367674997
MaRDI QID: Q6097626
Anran Jiao, Min Zhu, Handi Zhang, Lu Lu, George Em Karniadakis
Publication date: 6 June 2023
Published in: Computer Methods in Applied Mechanics and Engineering
Full work available at URL: https://arxiv.org/abs/2212.06347
Keywords: fine-tuning, neural operators, extrapolation complexity, multifidelity learning, out-of-distribution inference
Related Items
Fourier-DeepONet: Fourier-enhanced deep operator networks for full waveform inversion with improved accuracy, generalizability, and robustness ⋮ Branched latent neural maps ⋮ A super-real-time three-dimension computing method of digital twins in space nuclear power
Cites Work
- Multilayer feedforward networks are universal approximators
- A physics-informed operator regression framework for extracting data-driven continuum models
- Quantifying the generalization error in deep learning in terms of data distribution and neural network smoothness
- Error analysis for physics-informed neural networks (PINNs) approximating Kolmogorov PDEs
- B-DeepONet: an enhanced Bayesian DeepONet for solving noisy parametric PDEs using accelerated replica exchange SGLD
- Neural operator prediction of linear instability waves in high-speed boundary layers
- DeepM&Mnet: inferring the electroconvection multiphysics fields based on operator approximation by neural networks
- DeepM&Mnet for hypersonics: predicting the coupled flow and finite-rate chemistry behind a normal shock using neural-network approximation of operators
- A comprehensive and fair comparison of two neural operators (with practical extensions) based on FAIR data
- Deep solution operators for variational inequalities via proximal neural networks
- A composite neural network that learns from multi-fidelity data: application to function approximation and inverse PDE problems
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- A physics-informed variational DeepONet for predicting crack path in quasi-brittle materials
- Scalable uncertainty quantification for deep operator networks using randomized priors
- Nonlocal kernel network (NKN): a stable and resolution-independent deep neural network
- Interfacing finite elements with deep neural operators for fast multiscale modeling of mechanics problems
- A comprehensive study of non-adaptive and residual-based adaptive sampling for physics-informed neural networks
- Uncertainty quantification in scientific machine learning: methods, metrics, and comparisons
- On the influence of over-parameterization in manifold based surrogates and deep neural operators
- On a Formula for the L2 Wasserstein Metric between Measures on Euclidean and Hilbert Spaces
- Overcoming catastrophic forgetting in neural networks
- Predicting the output from a complex computer code when fast approximations are available
- Deep double descent: where bigger models and more data hurt
- MIONet: Learning Multiple-Input Operators via Tensor Product
- Error estimates for DeepONets: a deep learning framework in infinite dimensions
- Exponential convergence of mixed hp-DGFEM for the incompressible Navier–Stokes equations in ℝ²
- DeepXDE: A Deep Learning Library for Solving Differential Equations
- A seamless multiscale operator neural network for inferring bubble dynamics
- Locally adaptive activation functions with slope recovery for deep and physics-informed neural networks
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- Bi-fidelity modeling of uncertain and partially unknown systems using DeepONets