Error analysis for deep neural network approximations of parametric hyperbolic conservation laws

From MaRDI portal
Publication:6405060

arXiv: 2207.07362
MaRDI QID: Q6405060


Authors: Tim De Ryck, Siddhartha Mishra


Publication date: 15 July 2022

Abstract: We derive rigorous bounds on the error resulting from the approximation of the solution of parametric hyperbolic scalar conservation laws with ReLU neural networks. We show that the approximation error can be made as small as desired with ReLU neural networks that overcome the curse of dimensionality. In addition, we provide an explicit upper bound on the generalization error in terms of the training error, the number of training samples, and the neural network size. The theoretical results are illustrated by numerical experiments.
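As an illustrative sketch (not the paper's construction): entropy solutions of scalar conservation laws are often piecewise smooth, and ReLU networks realize continuous piecewise linear functions exactly. For instance, the rarefaction-wave solution of Burgers' equation with Riemann data uL < uR is a clamp of the similarity variable x/t, which a two-ReLU network represents with zero approximation error:

```python
import numpy as np

def relu(z):
    # ReLU activation: max(z, 0)
    return np.maximum(z, 0.0)

def rarefaction_net(x, t, uL=0.0, uR=1.0):
    """Exact rarefaction solution of u_t + (u^2/2)_x = 0 with Riemann
    data uL < uR, written as a two-ReLU network in s = x/t:
    clamp(s, uL, uR) = uL + relu(s - uL) - relu(s - uR)."""
    s = x / t
    return uL + relu(s - uL) - relu(s - uR)

# Evaluate on a grid at time t = 1: left state, linear fan, right state.
x = np.linspace(-1.0, 2.0, 7)
u = rarefaction_net(x, t=1.0)
```

Here `rarefaction_net` and the Riemann states are illustrative choices; the paper's actual networks are trained approximations over the parameter space, not closed-form representations.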

This page was built for publication: Error analysis for deep neural network approximations of parametric hyperbolic conservation laws
