Probabilistic lower bounds for approximation by shallow perceptron networks
DOI: 10.1016/j.neunet.2017.04.003 | zbMath: 1437.68063 | OpenAlex: W2605622124 | Wikidata: Q47855272 | Scholia: Q47855272 | MaRDI QID: Q2181058
Marcello Sanguineti, Věra Kůrková
Publication date: 18 May 2020
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2017.04.003
Keywords: model complexity; Chernoff-Hoeffding bounds; perceptrons; shallow networks; lower bounds on approximation rates
MSC classification: Artificial neural networks and deep learning (68T07); Approximation algorithms (68W25); Probability in computer science (algorithm analysis, random structures, phase transitions, etc.) (68Q87); Biologically inspired models of computation (DNA computing, membrane computing, etc.) (68Q07)
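As background for the "Chernoff-Hoeffding bounds" keyword above (and for the cited work "Probability Inequalities for Sums of Bounded Random Variables" listed under Cites Work), a standard form of Hoeffding's inequality is sketched below; the notation \(X_1,\dots,X_n\), \(a_i\), \(b_i\), \(S_n\), and \(t\) is generic and is not taken from the publication itself. For independent random variables \(X_1,\dots,X_n\) with \(a_i \le X_i \le b_i\) and \(S_n = \sum_{i=1}^{n} X_i\),
\[
\Pr\bigl(S_n - \mathbb{E}[S_n] \ge t\bigr) \;\le\; \exp\!\left(-\frac{2t^2}{\sum_{i=1}^{n}(b_i - a_i)^2}\right), \qquad t > 0.
\]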
Related Items (6)
Cites Work
- On the curse of dimensionality in the Ritz method
- Approximation by max-product neural network operators of Kantorovich type
- A comparison between fixed-basis and variable-basis schemes for function approximation and functional optimization
- Can dictionary-based computational models outperform the best linear ones?
- Intelligent systems. Approximation by artificial neural networks
- Some comparisons of complexity in dictionary-based and linear computational models
- Complexity of Gaussian-radial-basis networks approximating smooth functions
- Approximation by neural networks with a bounded number of nodes at each level
- Geometry and topology of continuous best and near best approximations
- Neural network operators: constructive interpolation of multivariate functions
- Optimal nonlinear approximation
- On the near optimality of the stochastic approximation of smooth functions by neural networks
- Approximation with random bases: pro et contra
- Pointwise and uniform approximation by multivariate neural network operators of the max-product type
- Max-product neural network and quasi-interpolation operators activated by sigmoidal functions
- Discrete Mathematics of Neural Networks
- Universal Approximation by Ridge Computational Models and Neural Networks: A Survey
- Learning Deep Architectures for AI
- Bounds on rates of variable-basis and neural-network approximation
- Comparison of worst case errors in linear and neural network approximation
- Dependence of Computational Models on Input Dimension: Tractability of Approximation and Optimization Tasks
- Probability Inequalities for Sums of Bounded Random Variables
- A Fast Learning Algorithm for Deep Belief Nets
- Enumeration of Seven-Argument Threshold Functions
- Continuity of approximation by neural networks in \(L_p\) spaces