Modified neural network operators and their convergence properties with summability methods
From MaRDI portal
Publication:2185008
DOI: 10.1007/s13398-020-00860-0 · zbMath: 1440.41011 · OpenAlex: W3027168902 · MaRDI QID: Q2185008
Publication date: 4 June 2020
Published in: Revista de la Real Academia de Ciencias Exactas, Físicas y Naturales. Serie A: Matemáticas. RACSAM
Full work available at URL: https://doi.org/10.1007/s13398-020-00860-0
Keywords: uniform approximation · strong summability · summability methods · neural network operators · bell-shaped function
Related Items (8)
- Density results by deep neural network operators with integer weights
- Quantitative estimates for neural network operators implied by the asymptotic behaviour of the sigmoidal activation functions
- Multivariate approximation in \(\varphi\)-variation for nonlinear integral operators via summability methods
- Regular summability methods in the approximation by max-min operators
- Some density results by deep Kantorovich type neural network operators
- Complex Shepard operators and their summability
- Analytical Meir-Keeler type contraction mappings and equivalent characterizations
- Nonlinear approximation in \(N\)-dimension with the help of summability methods
Cites Work
- Approximation by neural networks with sigmoidal functions
- Strong summation process in \(L_p\) spaces
- Approximation results for neural network operators activated by sigmoidal functions
- Multivariate neural network operators with sigmoidal activation functions
- Multivariate hyperbolic tangent neural network approximation
- Multivariate sigmoidal neural network approximation
- Interpolation and rates of convergence for a class of neural networks
- The approximation operators with sigmoidal functions
- Summation process of positive linear operators
- Sequence transformations and their applications
- Acceleration by subsequence transformations
- On summability and positive linear operators
- Quantitative results on almost convergence of a sequence of positive linear operators
- Uniform approximation by neural networks
- Rate of convergence of some neural network operators to the unit-univariate case
- Approximation by neural networks with a bounded number of nodes at each level
- Rate of convergence of some multivariate neural network operators to the unit
- A summability process on Baskakov-type approximation
- Saturation classes for MAX-product neural network operators activated by sigmoidal functions
- Estimates for the neural network operators of the max-product type with continuous and \(p\)-integrable functions
- Degree of approximation by neural and translation networks with a single hidden layer
- Approximation with neural networks activated by ramp sigmoids
- Summability process by Mastroianni operators and their generalizations
- Approximation by max-min operators: a general theory and its applications
- Convergence of a family of neural network operators of the Kantorovich type
- Strong and ordinary summability
- Max-product neural network and quasi-interpolation operators activated by sigmoidal functions
- Convergence for a family of neural network operators in Orlicz spaces
- Acceleration of Linear and Logarithmic Convergence
- Convergence results for a family of Kantorovich max-product neural network operators in a multivariate setting
- Summability on Mellin-type nonlinear integral operators
- Approximation by nonlinear integral operators via summability process
- Approximation by superpositions of a sigmoidal function