Approximation results by multivariate Kantorovich-type neural network sampling operators in Lebesgue spaces with variable exponents
DOI: 10.1007/s12215-023-00995-0
zbMATH Open: 1541.41016
MaRDI QID: Q6562343
Author: Benali Aharrouch
Publication date: 26 June 2024
Published in: Rendiconti del Circolo Matematico di Palermo
Recommendations
- Approximation theorems for a family of multivariate neural network operators in Orlicz-type spaces
- Convergence of a family of neural network operators of the Kantorovich type
- Exponential Sampling Type Kantorovich Max-Product Neural Network Operators
- Convergence for a family of neural network operators in Orlicz spaces
- Degree of approximation by multiple sigmoids Kantorovich-Choquet quasi-interpolation neural network operators
Keywords: uniform approximation; sigmoidal functions; multivariate neural network operators; Kantorovich-type operators; \(L^{p(\cdot)}\)-approximation
MSC: Approximation by other special function classes (41A30); Linear operator approximation theory (47A58); Rate of convergence, degree of approximation (41A25)
Cites Work
- Lebesgue and Sobolev spaces with variable exponents
- Orlicz spaces and modular spaces
- How sharp is the Jensen inequality?
- Title not available
- On modular spaces
- Title not available
- The construction and approximation of a class of neural networks operators with Ramp functions
- Approximation results for neural network operators activated by sigmoidal functions
- Approximation by neural networks with a bounded number of nodes at each level
- Convergence of a family of neural network operators of the Kantorovich type
- Multivariate neural network operators with sigmoidal activation functions
- On the approximation by neural networks with bounded number of neurons in hidden layers
- Intelligent systems. Approximation by artificial neural networks
- Quantitative estimates for neural network operators implied by the asymptotic behaviour of the sigmoidal activation functions
- On \(L^{p(x)}\) norms
- Title not available
- Existence of solutions for a class of degenerate elliptic equations in \(P(x)\)-Sobolev spaces
- Existence of weak and renormalized solutions of degenerated elliptic equation
- Convergence for a family of neural network operators in Orlicz spaces
- Necessary and sufficient condition for multistability of neural networks evolving on a closed hypercube
- Approximation results in Orlicz spaces for sequences of Kantorovich MAX-product neural network operators
- The errors of approximation for feedforward neural networks in the \(l^p\) metric
- Multistability of delayed recurrent neural networks with Mexican hat activation functions
- Existence and regularity of solutions for degenerate elliptic equations with variable growth
- Existence and regularity results for nonlinear and nonhomogeneous elliptic equation
- Hadoop neural network for parallel and distributed feature selection
- Density results by deep neural network operators with integer weights
Cited In (1)