DENSITY RESULTS BY DEEP NEURAL NETWORK OPERATORS WITH INTEGER WEIGHTS
From MaRDI portal
Publication:5052610
DOI: 10.3846/mma.2022.15974
OpenAlex: W4308939445
MaRDI QID: Q5052610
Publication date: 25 November 2022
Published in: Mathematical Modelling and Analysis
Full work available at URL: https://doi.org/10.3846/mma.2022.15974
Keywords: sigmoidal functions; density results; deep neural networks; neural network operators; ReLU activation function; RePU activation functions
MSC classifications: Artificial neural networks and deep learning (68T07); Rate of convergence, degree of approximation (41A25)
Related Items
- Approximation error for neural network operators by an averaged modulus of smoothness
- Some density results by deep Kantorovich type neural network operators
- Approximation by exponential-type polynomials
Cites Work
- Approximation results for neural network operators activated by sigmoidal functions
- Multivariate neural network operators with sigmoidal activation functions
- On sharpness of error bounds for univariate approximation by single hidden layer feedforward neural networks
- Approximative compactness of linear combinations of characteristic functions
- The approximation operators with sigmoidal functions
- Approximation by superposition of sigmoidal and radial basis functions
- On simultaneous approximations by radial basis function neural networks
- Rate of convergence of some neural network operators to the unit-univariate case
- Neural network operators: constructive interpolation of multivariate functions
- Saturation classes for MAX-product neural network operators activated by sigmoidal functions
- Multilayer feedforward networks are universal approximators
- Approximation results in Orlicz spaces for sequences of Kantorovich MAX-product neural network operators
- Approximation by exponential sampling type neural network operators
- Optimal approximation rate of ReLU networks in terms of width and depth
- Modified neural network operators and their convergence properties with summability methods
- Theory of deep convolutional neural networks: downsampling
- Interpolation by neural network operators activated by ramp functions
- Approximation by series of sigmoidal functions with applications to neural networks
- Error bounds for approximations with deep ReLU networks
- Universality of deep convolutional neural networks
- Intelligent systems II. Complete approximation by neural network operators
- Simultaneous approximations of multivariate functions and their derivatives by neural networks with one hidden layer
- Universal approximation bounds for superpositions of a sigmoidal function
- Approximation by truncated max‐product operators of Kantorovich‐type based on generalized (ϕ,ψ)‐kernels
- Approximation by superpositions of a sigmoidal function
- Fractional type multivariate neural network operators