New Error Bounds for Deep ReLU Networks Using Sparse Grids

From MaRDI portal
Publication: 5025775

DOI: 10.1137/18M1189336
MaRDI QID: Q5025775

Hadrien Montanelli, Qiang Du

Publication date: 3 February 2022

Published in: SIAM Journal on Mathematics of Data Science

Full work available at URL: https://arxiv.org/abs/1712.08688




Related Items (36)

Int-Deep: a deep learning initialized iterative method for nonlinear problems
Stationary Density Estimation of Itô Diffusions Using Deep Learning
SelectNet: self-paced learning for high-dimensional partial differential equations
ReLU deep neural networks from the hierarchical basis perspective
Convergence Rate Analysis for Deep Ritz Method
Deep ReLU Networks Overcome the Curse of Dimensionality for Generalized Bandlimited Functions
DeepParticle: learning invariant measure by a deep neural network minimizing Wasserstein distance on data generated from an interacting particle method
A note on the applications of one primary function in deep neural networks
Simultaneous neural network approximation for smooth functions
Deep Neural Networks for Solving Large Linear Systems Arising from High-Dimensional Problems
Neural network approximation: three hidden layers are enough
On the approximation of functions by tanh neural networks
Deep ReLU neural network approximation in Bochner spaces and applications to parametric PDEs
Rates of approximation by ReLU shallow neural networks
Deep equilibrium nets
Three ways to solve partial differential equations with neural networks — A review
Deep ReLU neural networks in high-dimensional approximation
Solving nonconvex energy minimization problems in martensitic phase transitions with a mesh-free deep learning approach
Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation
SignReLU neural network and its approximation ability
Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class
A Variational Neural Network Approach for Glacier Modelling with Nonlinear Rheology
Collocation approximation by deep neural ReLU networks for parametric and stochastic PDEs with lognormal inputs
Model reduction of coupled systems based on non-intrusive approximations of the boundary response maps
Deep ReLU networks and high-order finite element methods
Rectified deep neural networks overcome the curse of dimensionality for nonsmooth value functions in zero-sum games of nonlinear stiff systems
A mesh-free method for interface problems using the deep learning approach
ExSpliNet: An interpretable and expressive spline-based neural network
Deep Network Approximation for Smooth Functions
Better Approximations of High Dimensional Smooth Functions by Deep Neural Networks with Rectified Power Units
Deep Network Approximation Characterized by Number of Neurons
Error bounds for deep ReLU networks using the Kolmogorov-Arnold superposition theorem
Deep Network With Approximation Error Being Reciprocal of Width to Power of Square Root of Depth
Approximation of functions from Korobov spaces by deep convolutional neural networks
Error bounds for ReLU networks with depth and width parameters
Approximating functions with multi-features by deep convolutional neural networks



This page was built for publication: New Error Bounds for Deep ReLU Networks Using Sparse Grids