New Error Bounds for Deep ReLU Networks Using Sparse Grids
From MaRDI portal
Publication: 5025775
DOI: 10.1137/18M1189336
MaRDI QID: Q5025775
Publication date: 3 February 2022
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/1712.08688
Related Items (36)
- Int-Deep: a deep learning initialized iterative method for nonlinear problems
- Stationary Density Estimation of Itô Diffusions Using Deep Learning
- SelectNet: self-paced learning for high-dimensional partial differential equations
- ReLU deep neural networks from the hierarchical basis perspective
- Convergence Rate Analysis for Deep Ritz Method
- Deep ReLU Networks Overcome the Curse of Dimensionality for Generalized Bandlimited Functions
- DeepParticle: learning invariant measure by a deep neural network minimizing Wasserstein distance on data generated from an interacting particle method
- A note on the applications of one primary function in deep neural networks
- Simultaneous neural network approximation for smooth functions
- Deep Neural Networks for Solving Large Linear Systems Arising from High-Dimensional Problems
- Neural network approximation: three hidden layers are enough
- On the approximation of functions by tanh neural networks
- Deep ReLU neural network approximation in Bochner spaces and applications to parametric PDEs
- Rates of approximation by ReLU shallow neural networks
- DEEP EQUILIBRIUM NETS
- Three ways to solve partial differential equations with neural networks — A review
- Deep ReLU neural networks in high-dimensional approximation
- Solving nonconvex energy minimization problems in martensitic phase transitions with a mesh-free deep learning approach
- Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation
- SignReLU neural network and its approximation ability
- Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class
- A Variational Neural Network Approach for Glacier Modelling with Nonlinear Rheology
- Collocation approximation by deep neural ReLU networks for parametric and stochastic PDEs with lognormal inputs
- Model reduction of coupled systems based on non-intrusive approximations of the boundary response maps
- Deep ReLU networks and high-order finite element methods
- Rectified deep neural networks overcome the curse of dimensionality for nonsmooth value functions in zero-sum games of nonlinear stiff systems
- A mesh-free method for interface problems using the deep learning approach
- ExSpliNet: An interpretable and expressive spline-based neural network
- Deep Network Approximation for Smooth Functions
- Better Approximations of High Dimensional Smooth Functions by Deep Neural Networks with Rectified Power Units
- Deep Network Approximation Characterized by Number of Neurons
- Error bounds for deep ReLU networks using the Kolmogorov-Arnold superposition theorem
- Deep Network With Approximation Error Being Reciprocal of Width to Power of Square Root of Depth
- Approximation of functions from Korobov spaces by deep convolutional neural networks
- Error bounds for ReLU networks with depth and width parameters
- Approximating functions with multi-features by deep convolutional neural networks
Cites Work
- Spaces of functions of mixed smoothness and approximation from hyperbolic crosses
- Periodic interpolation and wavelets on sparse grids
- Multilayer feedforward networks are universal approximators
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Error bounds for approximations with deep ReLU networks
- Decomposition of Hardy Functions into Square Integrable Wavelets of Constant Shape
- Sparse grids
- Cubature, Approximation, and Isotropy in the Hypercube
- Sparse Spectral Approximations of High-Dimensional Problems Based on Hyperbolic Cross
- Approximation by superpositions of a sigmoidal function