Max-product neural network and quasi-interpolation operators activated by sigmoidal functions

From MaRDI portal
Publication:2630379


DOI: 10.1016/j.jat.2016.05.001 · zbMath: 1350.41001 · OpenAlex: W2406122930 · MaRDI QID: Q2630379

Gianluca Vinti, Danilo Costarelli

Publication date: 27 July 2016

Published in: Journal of Approximation Theory

Full work available at URL: https://doi.org/10.1016/j.jat.2016.05.001
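For context, the max-product neural network operators named in the title are commonly defined as follows; this is a sketch based on the standard formulation in the approximation-theory literature, not text taken from this record:

```latex
\[
  F_n(f, x) \;=\;
  \frac{\displaystyle\bigvee_{k=0}^{n} f\!\left(\tfrac{k}{n}\right)\,
        \phi_\sigma(nx - k)}
       {\displaystyle\bigvee_{k=0}^{n} \phi_\sigma(nx - k)},
  \qquad x \in [0,1],
\]
where $\bigvee$ denotes the (pointwise) maximum, $\sigma$ is a sigmoidal
activation function, and
\[
  \phi_\sigma(x) \;=\; \tfrac{1}{2}\bigl[\sigma(x+1) - \sigma(x-1)\bigr]
\]
is the density function generated by $\sigma$.
```

Replacing the sum of a classical quasi-interpolation operator by a maximum yields a nonlinear operator; the related items below study its convergence, saturation classes, and quantitative estimates.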



Related Items

Approximation theorems for a family of multivariate neural network operators in Orlicz-type spaces
Approximation by max-product neural network operators of Kantorovich type
A characterization of the convergence in variation for the generalized sampling series
Solving numerically nonlinear systems of balance laws by multivariate sigmoidal functions approximation
Approximation by truncated Lupaş operators of max-product kind
Solving polynomial systems using a fast adaptive back propagation-type neural network algorithm
Approximation of discontinuous signals by sampling Kantorovich series
On the approximation by single hidden layer feedforward neural networks with fixed weights
Probabilistic lower bounds for approximation by shallow perceptron networks
Event-triggered \(\mathcal H_\infty\) state estimation for semi-Markov jumping discrete-time neural networks with quantization
Saturation classes for MAX-product neural network operators activated by sigmoidal functions
Modified neural network operators and their convergence properties with summability methods
Pointwise and uniform approximation by multivariate neural network operators of the max-product type
Quantitative estimates for neural network operators implied by the asymptotic behaviour of the sigmoidal activation functions
Approximation error for neural network operators by an averaged modulus of smoothness
Approximation by multivariate max-product Kantorovich-type operators and learning rates of least-squares regularized regression
Detection of thermal bridges from thermographic images by means of image processing approximation algorithms
Approximation by pseudo-linear discrete operators
Fractional type multivariate neural network operators
Convergence for a family of neural network operators in Orlicz spaces
Convergence in variation for the multidimensional generalized sampling series and applications to smoothing for digital image processing
Approximation by max-min operators: a general theory and its applications
Estimates for the neural network operators of the max-product type with continuous and \(p\)-integrable functions
Approximation results in Orlicz spaces for sequences of Kantorovich MAX-product neural network operators
The max-product generalized sampling operators: convergence and quantitative estimates
Quantitative estimates involving K-functionals for neural network-type operators
Limitations of shallow nets approximation
A comparison between the sampling Kantorovich algorithm for digital image processing with some interpolation and quasi-interpolation methods
Fractional type multivariate sampling operators
An Inverse Result of Approximation by Sampling Kantorovich Series
Extension of saturation theorems for the sampling Kantorovich operators
Approximate solutions of Volterra integral equations by an interpolation method based on ramp functions
Approximation by mixed operators of max-product-Choquet type
Approximation by max-product operators of Kantorovich type
On approximation by max-product Shepard operators
New approximation properties of the Bernstein max-min operators and Bernstein max-product operators
Approximation by max-product sampling Kantorovich operators with generalized kernels
Abstract integration with respect to measures and applications to modular convergence in vector lattice setting
Max-product type multivariate sampling operators and applications to image processing
Connections between the Approximation Orders of Positive Linear Operators and Their Max-Product Counterparts
Approximation by Kantorovich-type max-min operators and its applications



Cites Work