Error bounds for approximation with neural networks
Publication: 5959036
DOI: 10.1006/jath.2001.3613
zbMath: 1004.41007
OpenAlex: W2091297126
MaRDI QID: Q5959036
Martin Burger, Andreas Neubauer
Publication date: 28 April 2002
Published in: Journal of Approximation Theory
Full work available at URL: https://semanticscholar.org/paper/48d823e894306e4b0cff3d54eee343da415474e7
Mathematics Subject Classification:
- Rate of convergence, degree of approximation (41A25)
- Algorithms for approximation of functions (65D15)
- Approximation by arbitrary nonlinear expressions; widths and entropy (41A46)
- Numerical approximation and evaluation of special functions (33F05)
Related Items
- Analysis of the Generalization Error: Empirical Risk Minimization over Deep Artificial Neural Networks Overcomes the Curse of Dimensionality in the Numerical Approximation of Black–Scholes Partial Differential Equations
- A Proof that Artificial Neural Networks Overcome the Curse of Dimensionality in the Numerical Approximation of Black–Scholes Partial Differential Equations
- Linear and nonlinear approximation of spherical radial basis function networks
- Approximation by ridge function fields over compact sets
- Nonlinear function approximation: computing smooth solutions with an adaptive greedy algorithm
- Consistency of Ridge Function Fields for Varying Nonparametric Regression
- Full error analysis for the training of deep neural networks
- Suboptimal solutions to dynamic optimization problems via approximations of the policy functions
- Approximation by neural networks with a bounded number of nodes at each level
- Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation
- Neural ODE Control for Classification, Approximation, and Transport
- Convex regularization in statistical inverse learning problems
- The mechanism of additive composition
- Estimation of approximating rate for neural network in \(L^p_w\) spaces
- The ridge function representation of polynomials and an application to neural networks
- Approximation Properties of Ridge Functions and Extreme Learning Machines
- Almost optimal estimates for approximation and learning by radial basis function networks
- Quantitative estimates involving K-functionals for neural network-type operators
- Complexity of Gaussian-radial-basis networks approximating smooth functions
- Learning a function from noisy samples at a finite sparse set of points
Cites Work
- Generalization bounds for function approximation from scattered noisy data
- Multilayer feedforward networks are universal approximators
- Approximation properties of a multilayered feedforward artificial neural network
- Degree of approximation by neural and translation networks with a single hidden layer
- Approximation by radial basis functions with finitely many centers
- Convergence rates of certain approximate solutions to Fredholm integral equations of the first kind
- Approximation by Ridge Functions and Neural Networks
- Universal approximation bounds for superpositions of a sigmoidal function
- Training neural networks with noisy data as an ill-posed problem