D2NO: efficient handling of heterogeneous input function spaces with distributed deep neural operators
From MaRDI portal
Publication:6566058
Recommendations
- MODNO: multi-operator learning with distributed neural operators
- Physics-informed discretization-independent deep compositional operator network
- Neural operator induced Gaussian process framework for probabilistic solution of parametric partial differential equations
- Semi-supervised invertible neural operators for Bayesian inverse problems
Cites work
- A comprehensive and fair comparison of two neural operators (with practical extensions) based on FAIR data
- B-DeepONet: an enhanced Bayesian DeepONet for solving noisy parametric PDEs using accelerated replica exchange SGLD
- Convergence rate of DeepONets for learning operators arising from advection-diffusion equations
- DeepM&Mnet for hypersonics: predicting the coupled flow and finite-rate chemistry behind a normal shock using neural-network approximation of operators
- DeepM&Mnet: inferring the electroconvection multiphysics fields based on operator approximation by neural networks
- Learning partial differential equations via data discovery and sparse optimization
- MIONet: learning multiple-input operators via tensor product
- Neural operator prediction of linear instability waves in high-speed boundary layers
- Reduced operator inference for nonlinear partial differential equations
- Reliable extrapolation of deep neural operators informed by physics or sparse observations
- Scalable uncertainty quantification for deep operator networks using randomized priors
- Sparse dynamics for partial differential equations
- Uncertainty quantification in scientific machine learning: methods, metrics, and comparisons
Cited in (3)