Complexity estimates based on integral transforms induced by computational units
Publication:1941596
DOI: 10.1016/j.neunet.2012.05.002 · zbMath: 1261.44002 · OpenAlex: W2101695397 · Wikidata: Q51356558 · Scholia: Q51356558 · MaRDI QID: Q1941596
Publication date: 13 March 2013
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2012.05.002
integral transforms; neural networks; approximation from a dictionary; estimates of model complexity; norms induced by computational units
Neural networks for/in biological studies, artificial life and related topics (92B20)
Special integral transforms (Legendre, Hilbert, etc.) (44A15)
Related Items
- Neural network with unbounded activation functions is universal approximator
- Neural network operators: constructive interpolation of multivariate functions
- Two-layer networks with the \(\text{ReLU}^k\) activation function: Barron spaces and derivative approximation
- A neural network algorithm to pattern recognition in inverse problems
- Approximation results for neural network operators activated by sigmoidal functions
- Multivariate neural network operators with sigmoidal activation functions
- Integral representations of shallow neural network with Rectified Power Unit activation function
- Interpolation by neural network operators activated by ramp functions
- Complexity of Shallow Networks Representing Finite Mappings
- Some implications of interval approach to dimension for network complexity
- Correlations of random classifiers on large data sets
Cites Work
- On the tractability of multivariate integration and approximation by neural networks
- Complexity of Gaussian-radial-basis networks approximating smooth functions
- A simple lemma on greedy approximation in Hilbert space and convergence rates for projection pursuit regression and neural network training
- Rates of convex approximation in non-Hilbert spaces
- Approximation and learning of convex superpositions
- Weighted quadrature formulas and approximation by zonal function networks on the sphere
- A Sobolev-type upper bound for rates of approximation by linear combinations of Heaviside plane waves
- An Integral Upper Bound for Neural Network Approximation
- Kernel techniques: From machine learning to meshless methods
- Minimization of Error Functionals over Perceptron Networks
- Integral combinations of Heavisides
- Geometric Upper Bounds on Rates of Variable-Basis Approximation
- Real Interpolation of Sobolev Spaces on Subdomains of \(\mathbb{R}^n\)
- Feedforward Neural Network Methodology
- Universal approximation bounds for superpositions of a sigmoidal function
- Comparison of worst case errors in linear and neural network approximation
- Error Estimates for Approximate Optimization by the Extended Ritz Method