Nonlinear approximation using Gaussian kernels
Abstract: It is well known that nonlinear approximation has an advantage over linear schemes in the sense that it provides approximation rates comparable to those of the linear schemes, but to a larger class of approximands. This was established for spline approximations and for wavelet approximations, and more recently by DeVore and Ron for homogeneous radial basis function (surface spline) approximations. However, no such results are known for the Gaussian function, the preferred kernel in machine learning and several engineering problems. We introduce and analyze in this paper a new algorithm for approximating functions using translates of Gaussian functions with varying tension parameters. At its heart it employs the nonlinear approximation strategy of DeVore and Ron, but it selects kernels by a method that is not straightforward. The crux of the difficulty lies in the necessity to vary the tension parameter in the Gaussian function spatially according to local information about the approximand: error analysis of Gaussian approximation schemes with varying tension is, by and large, an elusive target for approximators. We show that our algorithm is suitably optimal in the sense that it provides approximation rates similar to other established nonlinear methodologies such as spline and wavelet approximations. As expected and desired, the approximation rates can be as high as needed and are essentially saturated only by the smoothness of the approximand.
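The abstract describes the scheme only at a conceptual level. As a loose illustration of the general idea of nonlinear Gaussian approximation with spatially varying tension (not the authors' algorithm), the sketch below greedily selects Gaussian translates from a dictionary in which each candidate atom carries its own width, letting narrow atoms concentrate where the target is rough. The function names (gaussian_atom, greedy_gaussian_fit), the candidate grids, and the test function are all illustrative assumptions.

```python
# Illustrative sketch only: a greedy pursuit over Gaussian translates whose
# width (tension) varies per dictionary atom.  This is NOT the algorithm of
# the paper, merely a toy showing adaptive (nonlinear) Gaussian approximation.
import numpy as np

def gaussian_atom(x, center, width):
    """Single Gaussian translate with its own tension parameter."""
    return np.exp(-((x - center) / width) ** 2)

def greedy_gaussian_fit(x, f, n_terms=10):
    """Pick Gaussian atoms one at a time, letting both center and width
    adapt to where the residual is large (a stand-in for using 'local
    information about the approximand')."""
    centers = np.linspace(x.min(), x.max(), 41)   # candidate translates
    widths = np.geomspace(0.02, 1.0, 12)          # candidate tensions
    chosen, residual = [], f.copy()
    for _ in range(n_terms):
        # Score every (center, width) pair by normalized correlation
        # with the current residual and keep the best one.
        best, best_score = None, -np.inf
        for c in centers:
            for w in widths:
                g = gaussian_atom(x, c, w)
                score = abs(residual @ g) / np.linalg.norm(g)
                if score > best_score:
                    best, best_score = (c, w), score
        chosen.append(best)
        # Refit all coefficients jointly by least squares (back-projection).
        A = np.column_stack([gaussian_atom(x, c, w) for c, w in chosen])
        coef, *_ = np.linalg.lstsq(A, f, rcond=None)
        residual = f - A @ coef
    return chosen, coef, residual

if __name__ == "__main__":
    x = np.linspace(-1.0, 1.0, 400)
    f = np.sign(x) * np.sqrt(np.abs(x))   # kink at 0 calls for locally narrow atoms
    atoms, coef, res = greedy_gaussian_fit(x, f, n_terms=12)
    print("max error:", np.max(np.abs(res)))
```

Running the sketch on a function with a kink shows the qualitative point of the abstract: the selected atoms near the singularity come out with small widths, while smooth regions are covered by a few wide atoms, which is what a fixed-tension (linear) scheme cannot do.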
Cites work
- scientific article; zbMATH DE number 412139 (title unavailable)
- scientific article; zbMATH DE number 5359727 (title unavailable)
- Approximation by radial bases and neural networks
- Approximation using scattered shifts of a multivariate function
- Fast rates for support vector machines using Gaussian kernels
- Fourier analysis of the approximation power of principal shift-invariant spaces
- Improved accuracy of multiquadric interpolation using variable shape parameters
- Learnability of Gaussians with flexible variances
- Some Maximal Inequalities
- The Runge phenomenon and spatially variable shape parameters in RBF interpolation
Cited in (20)
- Green's functions: taking another look at kernel approximation, radial basis functions, and splines
- An integral equation method for the numerical solution of the Burgers equation
- Nonuniform sampling and approximation in Sobolev space from perturbation of the framelet system
- On linear versus nonlinear approximation in the average case setting
- scientific article; zbMATH DE number 1857836 (title unavailable)
- The rate of approximation of Gaussian radial basis neural networks in continuous function space
- Cardinal interpolation with Gaussian kernels
- Highly localized RBF Lagrange functions for finite difference methods on spheres
- Gaussian sum approximation for non-linear fixed-point prediction
- \textit{hp}-VPINNs: variational physics-informed neural networks with domain decomposition
- Weak form theory-guided neural network (TgNN-wf) for deep learning of subsurface single- and two-phase flow
- Kernel methods for the approximation of some key quantities of nonlinear systems
- Regular families of kernels for nonlinear approximation
- scientific article; zbMATH DE number 3855674 (title unavailable)
- Nonlinear approximation via compositions
- scientific article; zbMATH DE number 5026764 (title unavailable)
- Deep network approximation characterized by number of neurons
- Exponential tractability of \(L_2\)-approximation with function values
- Optimal approximation by one Gaussian function to probability density functions
- Gaussianization machines for non-Gaussian function estimation models