Neural Networks for Localized Approximation
Publication: 4317663
DOI: 10.2307/2153285 · zbMath: 0806.41020 · OpenAlex: W2021676252 · MaRDI QID: Q4317663
Xin Li, Hrushikesh N. Mhaskar, Charles K. Chui
Publication date: 20 December 1994
Full work available at URL: https://doi.org/10.2307/2153285
Mathematics Subject Classification
- Circuits, networks (94C99)
- Multidimensional problems (41A63)
- General harmonic expansions, frames (42C15)
- Approximation by other special function classes (41A30)
Related Items (33)
- A Proof that Artificial Neural Networks Overcome the Curse of Dimensionality in the Numerical Approximation of Black–Scholes Partial Differential Equations
- Deep distributed convolutional neural networks: Universality
- Theoretical issues in deep networks
- Full error analysis for the training of deep neural networks
- Neural network interpolation operators activated by smooth ramp functions
- UNIFIED FRAMEWORK FOR MLPs AND RBFNs: INTRODUCING CONIC SECTION FUNCTION NETWORKS
- Limitations of the approximation capabilities of neural networks with one hidden layer
- A deep network construction that adapts to intrinsic dimensionality beyond the domain
- Approximation capabilities of neural networks on unbounded domains
- Approximating smooth and sparse functions by deep neural networks: optimal approximation rates and saturation
- Best approximation by linear combinations of characteristic functions of half-spaces.
- Neural network interpolation operators optimized by Lagrange polynomial
- Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation
- Learning sparse and smooth functions by deep sigmoid nets
- Neural network interpolation operators of multivariate functions
- Deep learning theory of distribution regression with CNNs
- Deep ReLU networks and high-order finite element methods
- Unnamed Item
- Better Approximations of High Dimensional Smooth Functions by Deep Neural Networks with Rectified Power Units
- Application of radial basis function and generalized regression neural networks in nonlinear utility function specification for travel mode choice modelling
- Limitations of shallow nets approximation
- Rates of approximation by neural network interpolation operators
- Deep neural networks for rotation-invariance approximation and learning
- On simultaneous approximations by radial basis function neural networks
- Approximative compactness of linear combinations of characteristic functions
- Optimal Approximation with Sparsely Connected Deep Neural Networks
- Approximating functions with multi-features by deep convolutional neural networks
- Spline representation and redundancies of one-dimensional ReLU neural network models
- Extension of localised approximation by neural networks
- DNN expression rate analysis of high-dimensional PDEs: application to option pricing
- Nonlinear approximation and (deep) ReLU networks
- Approximation spaces of deep neural networks
- Constructive approximate interpolation by neural networks
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Approximation by Ridge functions and neural networks with one hidden layer
- Approximation by superposition of sigmoidal and radial basis functions
- Multilayer feedforward networks are universal approximators
- Approximation properties of a multilayered feedforward artificial neural network
- Regularization Algorithms for Learning That Are Equivalent to Multilayer Networks
- On Compactly Supported Spline Wavelets and a Duality Principle
- Universal approximation bounds for superpositions of a sigmoidal function
- Approximation by superpositions of a sigmoidal function