Universal approximation bounds for superpositions of a sigmoidal function
From MaRDI portal
Publication: 4277151
DOI: 10.1109/18.256500
zbMATH Open: 0818.68126
OpenAlex: W2166116275
MaRDI QID: Q4277151
Author: Andrew R. Barron
Publication date: 7 February 1994
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://semanticscholar.org/paper/04113e8974341f97258800126d05fd8df2751b7e
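For context, a sketch of the publication's central result (notation follows the paper): for a target function \(f\) on the ball \(B_r \subset \mathbb{R}^d\) whose Fourier transform has a finite first moment \(C_f = \int_{\mathbb{R}^d} |\omega|\, |\tilde f(\omega)|\, d\omega\), there is a single-hidden-layer sigmoidal network with \(n\) units achieving a squared \(L^2\) error of order \(1/n\), independent of the dimension \(d\):

```latex
\exists\, f_n(x) = \sum_{k=1}^{n} c_k\, \sigma(a_k \cdot x + b_k) + c_0
\quad\text{such that}\quad
\int_{B_r} \bigl(f(x) - f_n(x)\bigr)^2\, \mu(dx) \;\le\; \frac{(2 r C_f)^2}{n},
```

where \(\sigma\) is any sigmoidal activation and \(\mu\) an arbitrary probability measure on \(B_r\). This dimension-independent \(O(1/n)\) rate is the "universal approximation bound" of the title and is the result the works cited below build on.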
Recommendations
- Approximation by superpositions of a sigmoidal function
- Approximation and estimation bounds for artificial neural networks
- Estimation of approximating rate for neural network in \(L^p_w\) spaces
- Degree of Approximation Results for Feedforward Networks Approximating Unknown Mappings and Their Derivatives
- Approximation Capability of Layered Neural Networks with Sigmoid Units on Two Layers
Cited in (only the first 100 items are shown):
- A class \(+1\) sigmoidal activation functions for FFANNs
- Optimal deep neural networks by maximization of the approximation power
- Deep Neural Networks for Solving Large Linear Systems Arising from High-Dimensional Problems
- Restricted polynomial regression
- Robust control of dynamical systems using neural networks with input–output feedback linearization
- Approximate models for nonlinear dynamical systems and their generalization properties
- Predictive neuro-control of uncertain systems: Design and use of a neuro-optimizer
- On universal estimators in learning theory
- Some extensions of radial basis functions and their applications in artificial intelligence
- Approximation theorems for a family of multivariate neural network operators in Orlicz-type spaces
- On the achievability of blind source separation for high-dimensional nonlinear source mixtures
- Solving numerically nonlinear systems of balance laws by multivariate sigmoidal functions approximation
- Convex polynomial and ridge approximation of Lipschitz functions in \(\mathbb R^d\)
- Almost optimal estimates for approximation and learning by radial basis function networks
- Selection dynamics for deep neural networks
- Convergence for a family of neural network operators in Orlicz spaces
- A note on error bounds for function approximation using nonlinear networks
- The complexity of model classes, and smoothing noisy data
- High-dimensional dynamics of generalization error in neural networks
- Neural network interpolation operators optimized by Lagrange polynomial
- Deep network with approximation error being reciprocal of width to power of square root of depth
- Extension of localised approximation by neural networks
- Adaptive control of nonlinear dynamic systems using \(\theta\)-adaptive neural networks
- Convergence of a least-squares Monte Carlo algorithm for American option pricing with dependent sample data
- Approximation properties of local bases assembled from neural network transfer functions
- Annealing stochastic approximation Monte Carlo algorithm for neural network training
- A note on error bounds for approximation in inner product spaces
- Stable adaptive neuro-control design via Lyapunov function derivative estimation
- A note on universal approximation by hierarchical fuzzy systems
- Pointwise and uniform approximation by multivariate neural network operators of the max-product type
- Convergence analysis of convex incremental neural networks
- Global Mittag-Leffler stability of complex valued fractional-order neural network with discrete and distributed delays
- Fusion methods for multiple sensor systems with unknown error densities
- Stochastically ordered multiple regression
- Approximation-based fixed-time adaptive tracking control for a class of uncertain nonlinear pure-feedback systems
- Neural network approximation
- Complexity of gene circuits, Pfaffian functions and the morphogenesis problem.
- Neural networks and nonlinear statistical methods: An application to the modelling of price-quality relationships
- Models of knowing and the investigation of dynamical systems
- Convergence results for a family of Kantorovich max-product neural network operators in a multivariate setting
- Minimization of Error Functionals over Perceptron Networks
- Approximation by ridge function fields over compact sets
- Solving Fredholm integral equations using deep learning
- Rates of minimization of error functionals over Boolean variable-basis functions
- Gabor neural networks with proven approximation properties
- Algorithms and complexity in biological pattern formation problems
- Nonlinear stable adaptive control based upon Elman networks
- Semi-nonparametric approximation and index options
- Modelling the dynamics of nonlinear time series using canonical variate analysis
- Deep ReLU network expression rates for option prices in high-dimensional, exponential Lévy models
- Neural network-based variational methods for solving quadratic porous medium equations in high dimensions
- Correlations of random classifiers on large data sets
- Standard representation and unified stability analysis for dynamic artificial neural network models
- A phase transition for finding needles in nonlinear haystacks with LASSO artificial neural networks
- Generalization bounds for sparse random feature expansions
- Understanding neural networks with reproducing kernel Banach spaces
- DNN expression rate analysis of high-dimensional PDEs: application to option pricing
- Do ideas have shape? Idea registration as the continuous limit of artificial neural networks
- Representation formulas and pointwise properties for Barron functions
- Depth separations in neural networks: what is actually being separated?
- The Barron space and the flow-induced function spaces for neural network models
- High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions
- Nonconvex regularization for sparse neural networks
- Equivalence of approximation by convolutional neural networks and fully-connected networks
- The Random Feature Model for Input-Output Maps between Banach Spaces
- Learning nonlinear state-space models using autoencoders
- Learning the mapping \(\mathbf{x}\mapsto \sum\limits_{i=1}^d x_i^2\): the cost of finding the needle in a haystack
- Deep Network Approximation for Smooth Functions
- Effect of depth and width on local minima in deep learning
- Structure probing neural network deflation
- Generalization Error Analysis of Neural Networks with Gradient Based Regularization
- Estimation of a regression function on a manifold by fully connected deep neural networks
- SelectNet: self-paced learning for high-dimensional partial differential equations
- Deep ReLU Networks Overcome the Curse of Dimensionality for Generalized Bandlimited Functions
- Error estimates for deep learning methods in fluid dynamics
- Neural network interpolation operators activated by smooth ramp functions
- A deep Fourier residual method for solving PDEs using neural networks
- Fractional type multivariate neural network operators
- An Augmented Lagrangian Deep Learning Method for Variational Problems with Essential Boundary Conditions
- Cornell potential: a neural network approach
- Theory of deep convolutional neural networks. III: Approximating radial functions
- Approximation by finite mixtures of continuous density functions that vanish at infinity
- Approximation Analysis of Convolutional Neural Networks
- Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation
- Machine learning based data retrieval for inverse scattering problems with incomplete data
- Neural network approximation: three hidden layers are enough
- Approximating functions with multi-features by deep convolutional neural networks
- Approximation of nonlinear functionals using deep ReLU networks
- Approximation of smooth functionals using deep ReLU networks
- Convergence of Physics-Informed Neural Networks Applied to Linear Second-Order Elliptic Interface Problems
- Theory of deep convolutional neural networks: downsampling
- On the rate of convergence of fully connected deep neural network regression estimates
- Transport analysis of infinitely deep neural network
- Finite neuron method and convergence analysis
- Theory of deep convolutional neural networks. II: Spherical analysis
- High-dimensional distribution generation through deep neural networks