Dimension-independent bounds on the degree of approximation by neural networks
DOI: 10.1147/rd.383.0277
zbMath: 0823.41012
OpenAlex: W2075407161
MaRDI QID: Q4320794
Hrushikesh N. Mhaskar, Charles A. Micchelli
Publication date: 22 October 1995
Published in: IBM Journal of Research and Development
Full work available at URL: https://semanticscholar.org/paper/9bea13e7f1d60ade14a566e8f28b622c0a3c5c4c
MSC classification: Neural networks for/in biological studies, artificial life and related topics (92B20); Spline approximation (41A15)
Related Items (14)
Lower estimation of approximation rate for neural networks
A Sobolev-type upper bound for rates of approximation by linear combinations of Heaviside plane waves
On the tractability of multivariate integration and approximation by neural networks
Limitations of the approximation capabilities of neural networks with one hidden layer
Function approximation with zonal function networks with activation functions analogous to the rectified linear unit functions
Applications of classical approximation theory to periodic basis function networks and computational harmonic analysis
Approximation rates for neural networks with general activation functions
Finite Neuron Method and Convergence Analysis
New study on neural networks: the essential order of approximation
The essential order of approximation for nearly exponential type neural networks
Complexity of Gaussian-radial-basis networks approximating smooth functions
Linearized two-layers neural networks in high dimension
On best approximation by ridge functions
Rates of minimization of error functionals over Boolean variable-basis functions