Approximation by exponential sampling type neural network operators
DOI: 10.1007/s13324-021-00543-y · zbMath: 1468.94351 · arXiv: 1911.05587 · OpenAlex: W3163514653 · Wikidata: Q115601044 · Scholia: Q115601044 · MaRDI QID: Q2037365
Publication date: 30 June 2021
Published in: Analysis and Mathematical Physics
Full work available at URL: https://arxiv.org/abs/1911.05587
Keywords: order of convergence; logarithmic modulus of continuity; exponential sampling series; neural network operators
MSC classifications: Linear operator approximation theory (47A58); Rate of convergence, degree of approximation (41A25); Approximation by operators (in particular, by integral operators) (41A35); Sampling theory in information and communication theory (94A20)
Related Items (6)
Cites Work
- On the Paley-Wiener theorem in the Mellin transform setting
- Solving Volterra integral equations of the second kind by sigmoidal functions approximation
- Approximation results for neural network operators activated by sigmoidal functions
- Multivariate neural network operators with sigmoidal activation functions
- Intelligent systems. Approximation by artificial neural networks
- Univariate hyperbolic tangent neural network approximation
- Multivariate hyperbolic tangent neural network approximation
- Multivariate sigmoidal neural network approximation
- Direct and inverse results for Kantorovich type exponential sampling series
- Approximation by Ridge functions and neural networks with one hidden layer
- Advanced topics in Shannon sampling and interpolation theory
- Rate of convergence of some neural network operators to the unit-univariate case
- A direct approach to the Mellin transform
- Neural network operators: constructive interpolation of multivariate functions
- Saturation classes for MAX-product neural network operators activated by sigmoidal functions
- Multilayer feedforward networks are universal approximators
- An approximation by neural networks with a fixed weight
- Approximation with neural networks activated by ramp sigmoids
- Convergence of a family of neural network operators of the Kantorovich type
- Approximation by series of sigmoidal functions with applications to neural networks
- Exponential sampling series: convergence in Mellin-Lebesgue spaces
- The Mellin–Parseval formula and its interconnections with the exponential sampling theorem of optical physics
- On Mellin convolution operators: a direct approach to the asymptotic formulae
- A generalization of the exponential sampling series and its approximation properties
- Quantitative estimates involving K-functionals for neural network-type operators
- Exponential-sampling method for Laplace and other dilationally invariant transforms: II. Examples in photon correlation spectroscopy and Fraunhofer diffraction
- Approximation by superpositions of a sigmoidal function