An Integral Upper Bound for Neural Network Approximation
From MaRDI portal
Publication: 3399370
DOI: 10.1162/neco.2009.04-08-745
zbMath: 1186.68370
OpenAlex: W1987961932
Wikidata: Q48503903 (Scholia: Q48503903)
MaRDI QID: Q3399370
Publication date: 12 October 2009
Published in: Neural Computation
Full work available at URL: http://www.nusl.cz/ntk/nusl-39633
Related Items
- Approximation theorems for a family of multivariate neural network operators in Orlicz-type spaces
- Approximation by max-product neural network operators of Kantorovich type
- Saturation classes for max-product neural network operators activated by sigmoidal functions
- Complexity estimates based on integral transforms induced by computational units
- Convergence results for a family of Kantorovich max-product neural network operators in a multivariate setting
- Convergence for a family of neural network operators in Orlicz spaces
- Asymptotic expansion for neural network operators of the Kantorovich type and high order of approximation
- Piecewise convexity of artificial neural networks
- The construction and approximation of the neural network with two weights
Cites Work
- Complexity of Gaussian-radial-basis networks approximating smooth functions
- A simple lemma on greedy approximation in Hilbert space and convergence rates for projection pursuit regression and neural network training
- Rates of convex approximation in non-Hilbert spaces
- Approximation and learning of convex superpositions
- A Sobolev-type upper bound for rates of approximation by linear combinations of Heaviside plane waves
- Universal approximation bounds for superpositions of a sigmoidal function
- Error Estimates for Approximate Optimization by the Extended Ritz Method