Saturation classes for MAX-product neural network operators activated by sigmoidal functions
From MaRDI portal
Publication: 1682591
DOI: 10.1007/s00025-017-0692-6 · zbMath: 1376.41014 · OpenAlex: W2614181593 · MaRDI QID: Q1682591
Gianluca Vinti, Danilo Costarelli
Publication date: 30 November 2017
Published in: Results in Mathematics
Full work available at URL: https://doi.org/10.1007/s00025-017-0692-6
Linear operator approximation theory (47A58) · Interpolation in approximation theory (41A05) · Rate of convergence, degree of approximation (41A25) · Approximation by other special function classes (41A30)
Related Items (27)
Approximation theorems for a family of multivariate neural network operators in Orlicz-type spaces ⋮ Density results by deep neural network operators with integer weights ⋮ Approximation by truncated Lupaş operators of max-product kind ⋮ Voronovskaja type theorems and high-order convergence neural network operators with sigmoidal functions ⋮ Modified neural network operators and their convergence properties with summability methods ⋮ Novel bifurcation results for a delayed fractional-order quaternion-valued neural network ⋮ Nonlinear approximation via compositions ⋮ Quantitative estimates for neural network operators implied by the asymptotic behaviour of the sigmoidal activation functions ⋮ Approximation error for neural network operators by an averaged modulus of smoothness ⋮ Some density results by deep Kantorovich type neural network operators ⋮ Approximation error of single hidden layer neural networks with fixed weights ⋮ Computing the approximation error for neural networks with weights varying on fixed directions ⋮ Superposition, reduction of multivariable problems, and approximation ⋮ Estimates for the neural network operators of the max-product type with continuous and \(p\)-integrable functions ⋮ Approximation results in Orlicz spaces for sequences of Kantorovich MAX-product neural network operators ⋮ Deep network approximation characterized by number of neurons ⋮ The max-product generalized sampling operators: convergence and quantitative estimates ⋮ Asymptotic expansion for neural network operators of the Kantorovich type and high order of approximation ⋮ Quantitative estimates involving K-functionals for neural network-type operators ⋮ Approximation by exponential sampling type neural network operators ⋮ Convergence of sampling Kantorovich operators in modular spaces with applications ⋮ Extension of saturation theorems for the sampling Kantorovich operators ⋮ On sharpness of error bounds for univariate approximation by single hidden layer feedforward neural networks ⋮ Approximate solutions of Volterra integral equations by an interpolation method based on ramp functions ⋮ Approximation by mixed operators of max-product-Choquet type ⋮ Approximation by max-product operators of Kantorovich type ⋮ Connections between the approximation orders of positive linear operators and their max-product counterparts
Cites Work
- Approximation by max-product neural network operators of Kantorovich type
- Solving Volterra integral equations of the second kind by sigmoidal functions approximation
- On the approximation by neural networks with bounded number of neurons in hidden layers
- Saturation and inverse results for the Bernstein max-product operator
- Rate of approximation for multivariate sampling Kantorovich operators on some functions spaces
- Intelligent systems. Approximation by artificial neural networks
- Constructive approximate interpolation by neural networks
- The approximation operators with sigmoidal functions
- Uniform approximation by neural networks
- Approximation by neural networks with a bounded number of nodes at each level
- The universal approximation capabilities of cylindrical approximate identity neural networks
- Neural network operators: constructive interpolation of multivariate functions
- Approximation with neural networks activated by ramp sigmoids
- On the near optimality of the stochastic approximation of smooth functions by neural networks
- Approximation by series of sigmoidal functions with applications to neural networks
- Necessary and sufficient condition for multistability of neural networks evolving on a closed hypercube
- A collocation method for solving nonlinear Volterra integro-differential equations of neutral type by sigmoidal functions
- Approximation by neural networks and learning theory
- Max-product neural network and quasi-interpolation operators activated by sigmoidal functions
- Approximation by Max-Product Type Operators
- Deep vs. shallow networks: An approximation theory perspective
- Approximation Results for a General Class of Kantorovich Type Operators
- Convergence for a family of neural network operators in Orlicz spaces
- Nonlinear integral operators with homogeneous kernels: pointwise approximation theorems
- An Integral Upper Bound for Neural Network Approximation
- Degree of Approximation for Nonlinear Multivariate Sampling Kantorovich Operators on Some Functions Spaces
- Universal approximation bounds for superpositions of a sigmoidal function
- Applications of sampling Kantorovich operators to thermographic images for seismic engineering
- Approximation by superpositions of a sigmoidal function