Approximation results by multivariate Kantorovich-type neural network sampling operators in Lebesgue spaces with variable exponents
From MaRDI portal
Publication: Q6562343
Recommendations
- Approximation theorems for a family of multivariate neural network operators in Orlicz-type spaces
- Convergence of a family of neural network operators of the Kantorovich type
- Exponential sampling type Kantorovich max-product neural network operators
- Convergence for a family of neural network operators in Orlicz spaces
- Degree of approximation by multiple sigmoids Kantorovich-Choquet quasi-interpolation neural network operators
Cites work
- scientific article; zbMATH DE number 95260 (no title available)
- scientific article; zbMATH DE number 3062378 (no title available)
- scientific article; zbMATH DE number 7716011 (no title available)
- Approximation by neural networks with a bounded number of nodes at each level
- Approximation results for neural network operators activated by sigmoidal functions
- Approximation results in Orlicz spaces for sequences of Kantorovich MAX-product neural network operators
- Convergence for a family of neural network operators in Orlicz spaces
- Convergence of a family of neural network operators of the Kantorovich type
- Density results by deep neural network operators with integer weights
- Existence and regularity of solutions for degenerate elliptic equations with variable growth
- Existence and regularity results for nonlinear and nonhomogeneous elliptic equation
- Existence of solutions for a class of degenerate elliptic equations in \(P(x)\)-Sobolev spaces
- Existence of weak and renormalized solutions of degenerated elliptic equation
- Hadoop neural network for parallel and distributed feature selection
- How sharp is the Jensen inequality?
- Intelligent systems. Approximation by artificial neural networks
- Lebesgue and Sobolev spaces with variable exponents
- Multistability of delayed recurrent neural networks with Mexican hat activation functions
- Multivariate neural network operators with sigmoidal activation functions
- Necessary and sufficient condition for multistability of neural networks evolving on a closed hypercube
- On modular spaces
- On the approximation by neural networks with bounded number of neurons in hidden layers
- On \(L^{p(x)}\) norms
- Orlicz spaces and modular spaces
- Quantitative estimates for neural network operators implied by the asymptotic behaviour of the sigmoidal activation functions
- The construction and approximation of a class of neural networks operators with Ramp functions
- The errors of approximation for feedforward neural networks in the \(l^p\) metric