On the approximation by neural networks with bounded number of neurons in hidden layers
From MaRDI portal
Publication: 483038
DOI: 10.1016/j.jmaa.2014.03.092
zbMath: 1303.41010
OpenAlex: W1978397327
MaRDI QID: Q483038
Publication date: 15 December 2014
Published in: Journal of Mathematical Analysis and Applications
Full work available at URL: https://doi.org/10.1016/j.jmaa.2014.03.092
- Learning and adaptive systems in artificial intelligence (68T05)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
- Multidimensional problems (41A63)
- Approximation by other special function classes (41A30)
Related Items (21)
- Approximation theorems for a family of multivariate neural network operators in Orlicz-type spaces
- Approximation by max-product neural network operators of Kantorovich type
- Max-product neural network and quasi-interpolation operators activated by sigmoidal functions
- Approximation by ridge functions and neural networks with a bounded number of neurons
- The universal approximation capabilities of cylindrical approximate identity neural networks
- Almost everywhere approximation capabilities of double Mellin approximate identity neural networks
- Neural network operators: constructive interpolation of multivariate functions
- Voronovskaja type theorems and high-order convergence neural network operators with sigmoidal functions
- Asymptotic expansions and Voronovskaja type theorems for the multivariate neural network operators
- Saturation classes for MAX-product neural network operators activated by sigmoidal functions
- Pointwise and uniform approximation by multivariate neural network operators of the max-product type
- Approximation capabilities of neural networks on unbounded domains
- Approximating smooth and sparse functions by deep neural networks: optimal approximation rates and saturation
- Construction of Optimal Feedback for Zooplankton Diel Vertical Migration
- Convergence for a family of neural network operators in Orlicz spaces
- A three layer neural network can represent any multivariate function
- A Single Hidden Layer Feedforward Network with Only One Neuron in the Hidden Layer Can Approximate Any Univariate Function
- Negative results for approximation using single layer and multilayer feedforward neural networks
- An adaptive learning rate backpropagation-type neural network for solving n × n systems of nonlinear algebraic equations
- Asymptotic expansion for neural network operators of the Kantorovich type and high order of approximation
- Limitations of shallow nets approximation
Cites Work
- Approximation by neural networks with weights varying on a finite set of directions
- Dimension, superposition of functions and separation of points, in compact metric spaces
- Approximation by Ridge functions and neural networks with one hidden layer
- Sur le théorème de superposition de Kolmogorov
- Uniformly separating families of functions
- Lower bounds for approximation by MLP neural networks
- Approximation and estimation bounds for artificial neural networks
- Fundamentality of ridge functions
- On the representation by linear superpositions
- An improvement in the superposition theorem of Kolmogorov
- Metric Entropy, Widths, and Superpositions of Functions
- On functions of three variables
- Dimension of metric spaces and Hilbert’s problem 13
- On the Structure of Continuous Functions of Several Variables
- Approximation by superpositions of a sigmoidal function