Saturation classes for MAX-product neural network operators activated by sigmoidal functions

From MaRDI portal
Publication:1682591

DOI: 10.1007/s00025-017-0692-6
zbMath: 1376.41014
OpenAlex: W2614181593
MaRDI QID: Q1682591

Gianluca Vinti, Danilo Costarelli

Publication date: 30 November 2017

Published in: Results in Mathematics

Full work available at URL: https://doi.org/10.1007/s00025-017-0692-6
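The publication studies max-product neural network operators built from sigmoidal activation functions. As a rough illustration of the general construction (not the authors' exact definitions), a max-product operator replaces the sums of a classical neural network operator with maxima: \(F_n(f)(x) = \max_k f(k/n)\,\phi(nx-k) \,/\, \max_k \phi(nx-k)\), where \(\phi\) is a bell-shaped kernel obtained from a sigmoidal function \(\sigma\). The sketch below assumes the logistic sigmoid and the kernel \(\phi(x) = \tfrac{1}{2}[\sigma(x+1)-\sigma(x-1)]\); the function names and the choice of nodes on \([0,1]\) are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoidal activation (one common choice; illustrative assumption).
    return 1.0 / (1.0 + np.exp(-x))

def phi(x):
    # Bell-shaped kernel generated from the sigmoidal function:
    # phi(x) = (sigma(x+1) - sigma(x-1)) / 2.
    return 0.5 * (sigmoid(x + 1.0) - sigmoid(x - 1.0))

def max_product_operator(f, n, x):
    """Max-product operator F_n(f)(x) on [0, 1] (illustrative sketch).

    F_n(f)(x) = max_k f(k/n) * phi(n*x - k) / max_k phi(n*x - k),
    with k over the integer nodes 0, 1, ..., n. Max-product operators
    are typically considered for nonnegative functions f.
    """
    k = np.arange(0, n + 1)
    weights = phi(n * x - k)
    return np.max(f(k / n) * weights) / np.max(weights)
```

For example, evaluating the operator for \(f(t) = t\) at \(x = 0.5\) with \(n = 100\) reproduces the value \(f(0.5) = 0.5\) closely, since \(x\) lies on a node \(k/n\).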




Related Items (27)

Approximation theorems for a family of multivariate neural network operators in Orlicz-type spaces
Density results by deep neural network operators with integer weights
Approximation by truncated Lupaş operators of max-product kind
Voronovskaja type theorems and high-order convergence neural network operators with sigmoidal functions
Modified neural network operators and their convergence properties with summability methods
Novel bifurcation results for a delayed fractional-order quaternion-valued neural network
Nonlinear approximation via compositions
Quantitative estimates for neural network operators implied by the asymptotic behaviour of the sigmoidal activation functions
Approximation error for neural network operators by an averaged modulus of smoothness
Some density results by deep Kantorovich type neural network operators
Approximation error of single hidden layer neural networks with fixed weights
Computing the approximation error for neural networks with weights varying on fixed directions
Superposition, reduction of multivariable problems, and approximation
Estimates for the neural network operators of the max-product type with continuous and \(p\)-integrable functions
Approximation results in Orlicz spaces for sequences of Kantorovich max-product neural network operators
Deep network approximation characterized by number of neurons
The max-product generalized sampling operators: convergence and quantitative estimates
Asymptotic expansion for neural network operators of the Kantorovich type and high order of approximation
Quantitative estimates involving K-functionals for neural network-type operators
Approximation by exponential sampling type neural network operators
Convergence of sampling Kantorovich operators in modular spaces with applications
Extension of saturation theorems for the sampling Kantorovich operators
On sharpness of error bounds for univariate approximation by single hidden layer feedforward neural networks
Approximate solutions of Volterra integral equations by an interpolation method based on ramp functions
Approximation by mixed operators of max-product-Choquet type
Approximation by max-product operators of Kantorovich type
Connections between the approximation orders of positive linear operators and their max-product counterparts



Cites Work


This page was built for publication: Saturation classes for MAX-product neural network operators activated by sigmoidal functions