The rate of approximation of Gaussian radial basis neural networks in continuous function space
From MaRDI portal
Publication: 1940856
DOI: 10.1007/s10114-012-1369-4
zbMath: 1263.41009
MaRDI QID: Q1940856
Publication date: 8 March 2013
Published in: Acta Mathematica Sinica. English Series
Full work available at URL: https://doi.org/10.1007/s10114-012-1369-4
Keywords: rate of convergence; approximation; modulus of continuity; Gaussian radial basis feedforward neural networks
41A63: Multidimensional problems
41A05: Interpolation in approximation theory
41A25: Rate of convergence, degree of approximation
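The entry's subject is the rate at which Gaussian radial basis function (RBF) networks approximate continuous functions, measured via the modulus of continuity. As a minimal illustrative sketch (not taken from the paper itself), the snippet below fits a one-dimensional Gaussian RBF network by least squares and reports the uniform error on a grid as the number of centers grows; the width scaling `sigma = 1/n` is an assumed, commonly used choice, and `gaussian_rbf_fit` is a hypothetical helper name.

```python
import numpy as np

# Hypothetical illustration: approximate a continuous function on [0, 1] by
#   f_n(x) = sum_k c_k * exp(-(x - t_k)^2 / (2 * sigma^2)),
# fitting the coefficients c_k by least squares on a sample grid.

def gaussian_rbf_fit(f, n_centers, sigma, n_samples=200):
    """Fit a Gaussian RBF network to f on [0, 1]; return max error on the grid."""
    x = np.linspace(0.0, 1.0, n_samples)
    centers = np.linspace(0.0, 1.0, n_centers)
    # Design matrix: one Gaussian bump per center.
    G = np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * sigma ** 2))
    coeffs, *_ = np.linalg.lstsq(G, f(x), rcond=None)
    return np.max(np.abs(G @ coeffs - f(x)))

f = lambda x: np.sin(2 * np.pi * x)   # a smooth continuous target
for n in (4, 8, 16, 32):
    # Width tied to center spacing -- an assumed scaling, not the paper's.
    print(n, gaussian_rbf_fit(f, n, sigma=1.0 / n))
```

The printed uniform errors shrink as the number of centers increases, which is the qualitative behavior that approximation-rate results of this kind quantify.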
Related Items
- Deep Network Approximation Characterized by Number of Neurons
- Construction and approximation for a class of feedforward neural networks with sigmoidal function
- Nonlinear approximation via compositions
- Almost optimal estimates for approximation and learning by radial basis function networks
Cites Work
- Sobolev error estimates and a Bernstein inequality for scattered data interpolation via radial basis functions
- Nonlinear approximation using Gaussian kernels
- Complexity of Gaussian-radial-basis networks approximating smooth functions
- On simultaneous approximations by radial basis function neural networks
- Relaxed conditions for radial-basis function networks to be universal approximators
- On lower bounds in radial basis approximation
- On best approximation of classes by radial functions
- When is approximation by Gaussian networks necessarily a linear process?
- On quasi-interpolation by radial basis functions with scattered centres
- Error estimates and condition numbers for radial basis function interpolation
- Approximation by radial basis functions with finitely many centers
- Approximation by radial bases and neural networks