Approximation results for neural network operators activated by sigmoidal functions

From MaRDI portal
Publication:459444

DOI: 10.1016/j.neunet.2013.03.015
zbMath: 1296.41017
OpenAlex: W1986164709
Wikidata: Q51230478
Scholia: Q51230478
MaRDI QID: Q459444

Renato Spigler, Danilo Costarelli

Publication date: 9 October 2014

Published in: Neural Networks

Full work available at URL: https://doi.org/10.1016/j.neunet.2013.03.015
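
The record above is purely bibliographic. For orientation only, the following is a minimal sketch of the kind of operator the title refers to, assuming the construction commonly used in this line of work: given a sigmoidal activation sigma, one forms the kernel phi_sigma(x) = (sigma(x+1) - sigma(x-1))/2 and the normalized operator F_n(f, x) = sum_k f(k/n) phi_sigma(nx - k) / sum_k phi_sigma(nx - k), with k ranging over the integers between ceil(na) and floor(nb). The function names, the logistic choice of sigma, and the interval [0, 1] below are assumptions for illustration, not taken from this page.

```python
# Illustrative sketch (not part of the MaRDI record): univariate neural network
# operators activated by a sigmoidal function, in the normalized form described
# in the lead-in above. Constants and the test function are assumptions.
import numpy as np

def logistic(x):
    """Logistic sigmoidal activation sigma(x) = 1 / (1 + e^{-x})."""
    return 1.0 / (1.0 + np.exp(-x))

def phi(x, sigma=logistic):
    """Kernel phi_sigma(x) = (sigma(x + 1) - sigma(x - 1)) / 2."""
    return 0.5 * (sigma(x + 1.0) - sigma(x - 1.0))

def nn_operator(f, x, n, a=0.0, b=1.0, sigma=logistic):
    """Evaluate F_n(f, x) on [a, b]: samples f(k/n) weighted by
    phi_sigma(n*x - k), normalized by the sum of the weights,
    with k running from ceil(n*a) to floor(n*b)."""
    k = np.arange(np.ceil(n * a), np.floor(n * b) + 1)   # admissible nodes k
    x = np.atleast_1d(np.asarray(x, dtype=float))
    w = phi(n * x[:, None] - k[None, :], sigma)          # weights phi_sigma(nx - k)
    return (w @ f(k / n)) / w.sum(axis=1)                # normalized weighted sum

if __name__ == "__main__":
    # Example: approximating f(x) = sin(2*pi*x) on [0, 1] as n grows.
    f = lambda t: np.sin(2 * np.pi * t)
    xs = np.linspace(0.0, 1.0, 201)
    for n in (10, 50, 250):
        err = np.max(np.abs(nn_operator(f, xs, n) - f(xs)))
        print(f"n = {n:4d}   sup-norm error ~ {err:.3e}")
```

The printed sup-norm error shrinks as n increases, which is the qualitative behaviour the paper's approximation results quantify for continuous functions on a compact interval.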




Related Items (47)

On the Hausdorff distance between the Heaviside step function and Verhulst logistic function
Approximation by network operators with logistic activation functions
Max-product neural network and quasi-interpolation operators activated by sigmoidal functions
DENSITY RESULTS BY DEEP NEURAL NETWORK OPERATORS WITH INTEGER WEIGHTS
Degree of Approximation for Nonlinear Multivariate Sampling Kantorovich Operators on Some Functions Spaces
Multivariate neural network interpolation operators
Solving numerically nonlinear systems of balance laws by multivariate sigmoidal functions approximation
Neural network operators: constructive interpolation of multivariate functions
Voronovskaja type theorems and high-order convergence neural network operators with sigmoidal functions
Asymptotic expansions and Voronovskaja type theorems for the multivariate neural network operators
Modified neural network operators and their convergence properties with summability methods
Quantitative estimates for neural network operators implied by the asymptotic behaviour of the sigmoidal activation functions
On the approximation of functions by tanh neural networks
On sharpness of error bounds for multivariate neural network approximation
On decision regions of narrow deep neural networks
Approximation by a class of neural network operators on scattered data
Approximation error for neural network operators by an averaged modulus of smoothness
Hyperbolic tangent like relied Banach space valued neural network multivariate approximations
Voronovskaya Type Asymptotic Expansions for Perturbed Neural Network Operators
Some density results by deep Kantorovich type neural network operators
Multiple general sigmoids based Banach space valued neural network multivariate approximation
Fractional type multivariate neural network operators
The construction and approximation of feedforward neural network with hyperbolic tangent function
Approximation error of single hidden layer neural networks with fixed weights
Order of approximation for exponential sampling type neural network operators
Richards's curve induced Banach space valued multivariate neural network approximation
Error Estimation for Approximate Solutions of Delay Volterra Integral Equations
Unnamed Item
Multivariate neural network operators with sigmoidal activation functions
On the approximation of the step function by some sigmoid functions
Rate of approximation for multivariate sampling Kantorovich operators on some functions spaces
Modeling of complex dynamic systems using differential neural networks with the incorporation of a priori knowledge
Extreme learning machine collocation for the numerical solution of elliptic PDEs with sharp gradients
Interpolation by neural network operators activated by ramp functions
Convergence of a family of neural network operators of the Kantorovich type
Approximation by series of sigmoidal functions with applications to neural networks
An adaptive learning rate backpropagation-type neural network for solving n × n systems on nonlinear algebraic equations
Unnamed Item
Approximation by exponential sampling type neural network operators
Approximation rates for neural networks with encodable weights in smoothness spaces
Quantitative approximation by perturbed Kantorovich-Choquet neural network operators
On sharpness of error bounds for univariate approximation by single hidden layer feedforward neural networks
Approximate solutions of Volterra integral equations by an interpolation method based on ramp functions
The construction and approximation of the neural network with two weights
Approximation by perturbed neural network operators
Approximations by multivariate perturbed neural network operators
High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions



Cites Work


This page was built for publication: Approximation results for neural network operators activated by sigmoidal functions