Approximations by multivariate perturbed neural network operators
DOI: 10.1142/S0219530515500293 · zbMath: 1364.41005 · OpenAlex: W1889662278 · MaRDI QID: Q5267949
Publication date: 13 June 2017
Published in: Analysis and Applications
Full work available at URL: https://doi.org/10.1142/s0219530515500293
Keywords: multivariate neural network approximation; multivariate modulus of continuity; multivariate Jackson-type inequality; multivariate perturbation of operators
MSC: Inequalities in approximation (Bernstein, Jackson, Nikol'skiĭ-type inequalities) (41A17); Rate of convergence, degree of approximation (41A25); Approximation by positive operators (41A36); Approximation by other special function classes (41A30)
Related Items (1)
Cites Work
- Approximation results for neural network operators activated by sigmoidal functions
- Multivariate neural network operators with sigmoidal activation functions
- The essential order of approximation for neural networks
- The approximation operators with sigmoidal functions
- Approximation by Ridge functions and neural networks with one hidden layer
- Approximation by superposition of sigmoidal and radial basis functions
- Uniform approximation by neural networks
- Rate of convergence of some neural network operators to the unit-univariate case
- Rate of convergence of some multivariate neural network operators to the unit
- Multilayer feedforward networks are universal approximators
- An approximation by neural networks with a fixed weight
- Degree of approximation by neural and translation networks with a single hidden layer
- Universal approximation bounds for superpositions of a sigmoidal function
- Approximation by perturbed neural network operators
- A logical calculus of the ideas immanent in nervous activity
- Approximation by superpositions of a sigmoidal function