Max-product neural network and quasi-interpolation operators activated by sigmoidal functions
DOI: 10.1016/j.jat.2016.05.001
zbMath: 1350.41001
OpenAlex: W2406122930
MaRDI QID: Q2630379
Authors: Gianluca Vinti, Danilo Costarelli
Publication date: 27 July 2016
Published in: Journal of Approximation Theory
Full work available at URL: https://doi.org/10.1016/j.jat.2016.05.001
Keywords: order of approximation; uniform approximation; sigmoidal functions; max-product operators; neural network operators
Related Items (41)
- Approximation theorems for a family of multivariate neural network operators in Orlicz-type spaces
- Approximation by max-product neural network operators of Kantorovich type
- A characterization of the convergence in variation for the generalized sampling series
- Solving numerically nonlinear systems of balance laws by multivariate sigmoidal functions approximation
- Approximation by truncated Lupaş operators of max-product kind
- Solving polynomial systems using a fast adaptive back propagation-type neural network algorithm
- Approximation of discontinuous signals by sampling Kantorovich series
- On the approximation by single hidden layer feedforward neural networks with fixed weights
- Probabilistic lower bounds for approximation by shallow perceptron networks
- Event-triggered \(\mathcal H_\infty\) state estimation for semi-Markov jumping discrete-time neural networks with quantization
- Saturation classes for MAX-product neural network operators activated by sigmoidal functions
- Modified neural network operators and their convergence properties with summability methods
- Pointwise and uniform approximation by multivariate neural network operators of the max-product type
- Quantitative estimates for neural network operators implied by the asymptotic behaviour of the sigmoidal activation functions
- Approximation error for neural network operators by an averaged modulus of smoothness
- Approximation by multivariate max-product Kantorovich-type operators and learning rates of least-squares regularized regression
- Detection of thermal bridges from thermographic images by means of image processing approximation algorithms
- Approximation by pseudo-linear discrete operators
- Fractional type multivariate neural network operators
- Convergence for a family of neural network operators in Orlicz spaces
- Convergence in variation for the multidimensional generalized sampling series and applications to smoothing for digital image processing
- Approximation by max-min operators: a general theory and its applications
- Estimates for the neural network operators of the max-product type with continuous and \(p\)-integrable functions
- Approximation results in Orlicz spaces for sequences of Kantorovich MAX-product neural network operators
- The max-product generalized sampling operators: convergence and quantitative estimates
- Quantitative estimates involving K-functionals for neural network-type operators
- Limitations of shallow nets approximation
- A comparison between the sampling Kantorovich algorithm for digital image processing with some interpolation and quasi-interpolation methods
- Fractional type multivariate sampling operators
- An Inverse Result of Approximation by Sampling Kantorovich Series
- Extension of saturation theorems for the sampling Kantorovich operators
- Approximate solutions of Volterra integral equations by an interpolation method based on ramp functions
- Approximation by mixed operators of max-product-Choquet type
- Approximation by max-product operators of Kantorovich type
- On approximation by max-product Shepard operators
- New approximation properties of the Bernstein max-min operators and Bernstein max-product operators
- Approximation by max-product sampling Kantorovich operators with generalized kernels
- Abstract integration with respect to measures and applications to modular convergence in vector lattice setting
- Max-product type multivariate sampling operators and applications to image processing
- Connections between the Approximation Orders of Positive Linear Operators and Their Max-Product Counterparts
- Approximation by Kantorovich-type max-min operators and its applications
Cites Work
- Solving Volterra integral equations of the second kind by sigmoidal functions approximation
- Approximation results for neural network operators activated by sigmoidal functions
- Multivariate neural network operators with sigmoidal activation functions
- On the approximation by neural networks with bounded number of neurons in hidden layers
- Saturation and inverse results for the Bernstein max-product operator
- Order of approximation for sampling Kantorovich operators
- Rate of approximation for multivariate sampling Kantorovich operators on some functions spaces
- Intelligent systems. Approximation by artificial neural networks
- A unifying approach to convergence of linear sampling type operators in Orlicz spaces
- Aspects of de La Vallée Poussin's work in approximation and its influence
- Constructive approximate interpolation by neural networks
- Convergence and rate of approximation for linear integral operators in \(BV^{\varphi }\)-spaces in multidimensional setting
- The approximation operators with sigmoidal functions
- Approximation by means of nonlinear Kantorovich sampling type operators in Orlicz spaces
- Approximation of continuous and discontinuous functions by generalized sampling series
- Feedforward nets for interpolation and classification
- Uniform approximation by neural networks
- Rate of convergence of some neural network operators to the unit-univariate case
- Approximation by neural networks with a bounded number of nodes at each level
- Neural network operators: constructive interpolation of multivariate functions
- Nonlinearity creates linear independence
- Approximation with neural networks activated by ramp sigmoids
- Interpolation by neural network operators activated by ramp functions
- Convergence of a family of neural network operators of the Kantorovich type
- Approximation by series of sigmoidal functions with applications to neural networks
- A collocation method for solving nonlinear Volterra integro-differential equations of neutral type by sigmoidal functions
- Approximation by neural networks and learning theory
- Approximation by Nonlinear Multivariate Sampling Kantorovich Type Operators and Applications to Image Processing
- Approximation Results for a General Class of Kantorovich Type Operators
- Degree of Approximation for Nonlinear Multivariate Sampling Kantorovich Operators on Some Functions Spaces
- On pointwise convergence of linear integral operators with homogeneous kernels
- Approximation with Respect to Goffman–Serrin Variation by Means of Non-Convolution Type Integral Operators
- Universal approximation bounds for superpositions of a sigmoidal function
- A general approach to the convergence theorems of generalized sampling series
- Prediction by Samples From the Past With Error Estimates Covering Discontinuous Signals
- Constructive Approximation by Superposition of Sigmoidal Functions
- Applications of sampling Kantorovich operators to thermographic images for seismic engineering
- Approximation by superpositions of a sigmoidal function
- Approximation by superpositions of a sigmoidal function