Complexity of Gaussian-radial-basis networks approximating smooth functions
From MaRDI portal
Publication: 998978
DOI: 10.1016/j.jco.2008.08.001
zbMath: 1162.65006
OpenAlex: W1979554434
MaRDI QID: Q998978
Paul C. Kainen, Marcello Sanguineti, Vera Kurková
Publication date: 30 January 2009
Published in: Journal of Complexity
Full work available at URL: https://doi.org/10.1016/j.jco.2008.08.001
Keywords: rates of approximation; model complexity; multivariable approximation; approximate smooth functions; Bessel and Sobolev norms; Gaussian-radial-basis-function networks; tractability of approximation; variation norms
Related Items (18)
Probabilistic lower bounds for approximation by shallow perceptron networks ⋮ Suboptimal solutions to dynamic optimization problems via approximations of the policy functions ⋮ Wavelet neural networks functional approximation and application ⋮ Comparing fixed and variable-width Gaussian networks ⋮ Two fast and accurate heuristic RBF learning rules for data classification ⋮ The rate of approximation of Gaussian radial basis neural networks in continuous function space ⋮ Complexity estimates based on integral transforms induced by computational units ⋮ Lower bounds for artificial neural network approximations: a proof that shallow neural networks fail to overcome the curse of dimensionality ⋮ Error estimates of quasi-interpolation and its derivatives ⋮ Accuracy of approximations of solutions to Fredholm equations by kernel methods ⋮ Dynamic programming and value-function approximation in sequential decision problems: error analysis and numerical results ⋮ Can dictionary-based computational models outperform the best linear ones? ⋮ New insights into Witsenhausen's counterexample ⋮ Estimates of variation with respect to a set and applications to optimization problems ⋮ Some comparisons of complexity in dictionary-based and linear computational models ⋮ Complexity of Shallow Networks Representing Finite Mappings ⋮ Approximation schemes for functional optimization problems ⋮ An Integral Upper Bound for Neural Network Approximation
Cites Work
- Sobolev error estimates and a Bernstein inequality for scattered data interpolation via radial basis functions
- On the tractability of multivariate integration and approximation by neural networks
- A simple lemma on greedy approximation in Hilbert space and convergence rates for projection pursuit regression and neural network training
- Approximation by superposition of sigmoidal and radial basis functions
- On best approximation by ridge functions
- Tractability and strong tractability of linear multivariate problems
- Complexity of weighted approximation over \(\mathbb{R}^d\)
- When is approximation by Gaussian networks necessarily a linear process?
- Minimization of Error Functionals over Perceptron Networks
- Real interpolation of Sobolev spaces on subdomains of \(\mathbb{R}^n\)
- Universal approximation bounds for superpositions of a sigmoidal function
- Dimension-independent bounds on the degree of approximation by neural networks
- Comparison of worst case errors in linear and neural network approximation
- Error bounds for approximation with neural networks
This page was built for publication: Complexity of Gaussian-radial-basis networks approximating smooth functions