Approximation error for neural network operators by an averaged modulus of smoothness
From MaRDI portal
Publication:6093307
DOI: 10.1016/j.jat.2023.105944
OpenAlex: W4385128583
MaRDI QID: Q6093307
Publication date: 6 September 2023
Published in: Journal of Approximation Theory
Full work available at URL: https://doi.org/10.1016/j.jat.2023.105944
Keywords: sigmoidal functions; quantitative estimates; averaged moduli of smoothness; neural network operators; ReLU activation function; RePUs activation function
MSC classifications: Artificial neural networks and deep learning (68T07); Rate of convergence, degree of approximation (41A25); Approximation by other special function classes (41A30)
Cites Work
- New estimates for the differences of positive linear operators
- Approximation results for neural network operators activated by sigmoidal functions
- Multivariate neural network operators with sigmoidal activation functions
- On sharpness of error bounds for univariate approximation by single hidden layer feedforward neural networks
- Approximative compactness of linear combinations of characteristic functions
- Approximation error of the Whittaker cardinal series in terms of an averaged modulus of smoothness covering discontinuous signals
- The approximation operators with sigmoidal functions
- Saturation classes for MAX-product neural network operators activated by sigmoidal functions
- Approximation results in Orlicz spaces for sequences of Kantorovich MAX-product neural network operators
- Approximation rates for neural networks with encodable weights in smoothness spaces
- Nonlinear approximation and (deep) ReLU networks
- Neural network identifiability for a family of sigmoidal nonlinearities
- Exponential ReLU DNN expression of holomorphic maps in high dimension
- Voronovskaja type theorems and high-order convergence neural network operators with sigmoidal functions
- Theory of deep convolutional neural networks: downsampling
- Approximating smooth functions by deep neural networks with sigmoid activation function
- Interpolation by neural network operators activated by ramp functions
- How sharp is the Jensen inequality?
- Universality of deep convolutional neural networks
- Max-product neural network and quasi-interpolation operators activated by sigmoidal functions
- Density results by deep neural network operators with integer weights
- Some approximation properties by a class of bivariate operators
- Neural Approximations for Optimal Control and Decision
- Simultaneous approximation of functions and their derivatives on the whole real axis
- Approximation by superpositions of a sigmoidal function