The rate of approximation of Gaussian radial basis neural networks in continuous function space
From MaRDI portal
Publication: Q1940856
DOI: 10.1007/s10114-012-1369-4
zbMath: 1263.41009
OpenAlex: W2402475499
MaRDI QID: Q1940856
Publication date: 8 March 2013
Published in: Acta Mathematica Sinica. English Series
Full work available at URL: https://doi.org/10.1007/s10114-012-1369-4
Keywords: rate of convergence; approximation; modulus of continuity; Gaussian radial basis feedforward neural networks
Multidimensional problems (41A63) Interpolation in approximation theory (41A05) Rate of convergence, degree of approximation (41A25)
Related Items (4)
Nonlinear approximation via compositions ⋮ Construction and approximation for a class of feedforward neural networks with sigmoidal function ⋮ Deep Network Approximation Characterized by Number of Neurons ⋮ Almost optimal estimates for approximation and learning by radial basis function networks
Cites Work
- Sobolev error estimates and a Bernstein inequality for scattered data interpolation via radial basis functions
- Nonlinear approximation using Gaussian kernels
- Complexity of Gaussian-radial-basis networks approximating smooth functions
- On simultaneous approximations by radial basis function neural networks
- Relaxed conditions for radial-basis function networks to be universal approximators.
- On lower bounds in radial basis approximation
- On best approximation of classes by radial functions
- When is approximation by Gaussian networks necessarily a linear process?
- On quasi-interpolation by radial basis functions with scattered centres
- Error estimates and condition numbers for radial basis function interpolation
- Approximation by radial basis functions with finitely many centers
- Approximation by radial bases and neural networks