Lower bounds for artificial neural network approximations: a proof that shallow neural networks fail to overcome the curse of dimensionality
From MaRDI portal
Publication:6155895
Abstract: Artificial neural networks (ANNs) have become a very powerful tool in the approximation of high-dimensional functions. In particular, deep ANNs, consisting of a large number of hidden layers, have been used very successfully in a series of practically relevant computational problems involving high-dimensional input data, ranging from classification tasks in supervised learning to optimal decision problems in reinforcement learning. There are also a number of mathematical results in the scientific literature which study the approximation capacities of ANNs in the context of high-dimensional target functions. In particular, a series of mathematical results show that sufficiently deep ANNs have the capacity to overcome the curse of dimensionality in the approximation of certain target function classes, in the sense that the number of parameters of the approximating ANNs grows at most polynomially in the dimension of the target functions under consideration. In the proofs of several such high-dimensional approximation results it is crucial that the involved ANNs are sufficiently deep and consist of a sufficiently large number of hidden layers, which grows with the dimension of the considered target functions. The topic of this work is to examine in more detail the depth of the ANNs involved in the approximation of high-dimensional target functions. In particular, the main result of this work proves that there exists a concretely specified sequence of functions which can be approximated without the curse of dimensionality by sufficiently deep ANNs, but which cannot be approximated without the curse of dimensionality if the involved ANNs are shallow or not deep enough.
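To make the parameter-count criterion in the abstract concrete, the following is a minimal illustrative sketch (not taken from the paper): it counts the weights and biases of a fully connected feedforward ANN mapping \(\mathbb{R}^d\) to \(\mathbb{R}\). "Overcoming the curse of dimensionality" means choosing architectures, typically with depth growing in \(d\), whose total parameter count is bounded polynomially in \(d\); the function name and the example widths are hypothetical choices for illustration.

```python
def ann_parameter_count(d, widths):
    """Number of weights and biases of a fully connected feedforward ANN
    with input dimension d, hidden layer widths `widths`, and scalar output.

    Each layer mapping R^m -> R^n contributes m*n weights and n biases.
    """
    layer_sizes = [d] + list(widths) + [1]
    return sum(m * n + n for m, n in zip(layer_sizes, layer_sizes[1:]))

# Illustration only: a deep network whose depth and widths both grow
# linearly in d has O(d^3) parameters -- polynomial in d, i.e. no curse
# of dimensionality in the sense of the abstract.
for d in (10, 100, 1000):
    print(d, ann_parameter_count(d, [2 * d] * d))
```

For a shallow (one-hidden-layer) network, the count is simply \((d+2)W + 1\) for width \(W\); the paper's lower bound says that for the constructed target functions no polynomially bounded choice of \(W\) suffices.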
Recommendations
- Probabilistic lower bounds for approximation by shallow perceptron networks
- Dimension-independent bounds on the degree of approximation by neural networks
- Lower bounds for approximation by MLP neural networks
- scientific article; zbMATH DE number 1182753
- Approximation and estimation bounds for artificial neural networks
- On the approximation by neural networks with bounded number of neurons in hidden layers
- scientific article; zbMATH DE number 1843047
- Rates of approximation by ReLU shallow neural networks
- Deep vs. shallow networks: an approximation theory perspective
- An Integral Upper Bound for Neural Network Approximation
Cites work
- scientific article; zbMATH DE number 1231230
- scientific article; zbMATH DE number 949396
- scientific article; zbMATH DE number 1405266
- scientific article; zbMATH DE number 3231743
- A Remark on Stirling's Formula
- A proof that rectified deep neural networks overcome the curse of dimensionality in the numerical approximation of semilinear heat equations
- A simple lemma on greedy approximation in Hilbert space and convergence rates for projection pursuit regression and neural network training
- Algorithms for solving high dimensional PDEs: from nonlinear Monte Carlo to machine learning
- An overview on deep learning-based approximation methods for partial differential equations
- Analysis of the generalization error: empirical risk minimization over deep artificial neural networks overcomes the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations
- Approximation and estimation bounds for artificial neural networks
- Approximation and learning of convex superpositions
- Approximation by Combinations of ReLU and Squared ReLU Ridge Functions With \(\ell^1\) and \(\ell^0\) Controls
- Approximation by superpositions of a sigmoidal function
- Comparison of worst case errors in linear and neural network approximation
- Complexity of Gaussian-radial-basis networks approximating smooth functions
- DNN expression rate analysis of high-dimensional PDEs: application to option pricing
- Deep neural network approximations for solutions of PDEs based on Monte Carlo algorithms
- Deep vs. shallow networks: an approximation theory perspective
- Dependence of Computational Models on Input Dimension: Tractability of Approximation and Optimization Tasks
- Error bounds for approximations with deep ReLU networks
- Full error analysis for the training of deep neural networks
- Geometric Upper Bounds on Rates of Variable-Basis Approximation
- Lower bounds for approximation by MLP neural networks
- Minimization of Error Functionals over Perceptron Networks
- Multilayer feedforward networks are universal approximators
- On the approximation by single hidden layer feedforward neural networks with fixed weights
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Optimal approximation with sparsely connected deep neural networks
- Proof that deep artificial neural networks overcome the curse of dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients
- Rates of convex approximation in non-Hilbert spaces
- Space-time error estimates for deep neural network approximations for differential equations
- Tractability of multivariate problems. Volume I: Linear information
- Tractability of multivariate problems. Volume II: Standard information for functionals
- Uniform error estimates for artificial neural network approximations for heat equations
- Universal approximation bounds for superpositions of a sigmoidal function
- Wahrscheinlichkeitstheorie
Cited in (3)
This page was built for publication: Lower bounds for artificial neural network approximations: a proof that shallow neural networks fail to overcome the curse of dimensionality