On the tractability of multivariate integration and approximation by neural networks
From MaRDI portal
DOI: 10.1016/j.jco.2003.11.004
zbMath: 1344.65036
OpenAlex: W2077597269
MaRDI QID: Q876822
Publication date: 30 April 2007
Published in: Journal of Complexity
Full work available at URL: https://doi.org/10.1016/j.jco.2003.11.004
Related Items (24)
- Approximation of Sobolev classes by polynomials and ridge functions
- Entropy, Randomization, Derandomization, and Discrepancy
- Suboptimal solutions to dynamic optimization problems via approximations of the policy functions
- Two fast and accurate heuristic RBF learning rules for data classification
- A deep network construction that adapts to intrinsic dimensionality beyond the domain
- On energy, discrepancy and group invariant measures on measurable subsets of Euclidean space
- Complexity estimates based on integral transforms induced by computational units
- On the problem of parameter estimation in exponential sums
- Accuracy of approximations of solutions to Fredholm equations by kernel methods
- A tribute to Géza Freud
- Negative results for approximation using single layer and multilayer feedforward neural networks
- Approximation by neural networks with weights varying on a finite set of directions
- Function approximation with zonal function networks with activation functions analogous to the rectified linear unit functions
- A low discrepancy sequence on graphs
- Provable approximation properties for deep neural networks
- Almost optimal estimates for approximation and learning by radial basis function networks
- Bracketing numbers for axis-parallel boxes and applications to geometric discrepancy
- Estimates of variation with respect to a set and applications to optimization problems
- Weighted quadrature formulas and approximation by zonal function networks on the sphere
- Complexity of Gaussian-radial-basis networks approximating smooth functions
- Super-resolution meets machine learning: approximation of measures
- Bounds and constructions for the star-discrepancy via \(\delta\)-covers
- Correlations of random classifiers on large data sets
- The Barron space and the flow-induced function spaces for neural network models
Cites Work
- Uniform approximation by neural networks
- Random approximants and neural networks
- Weak convergence and empirical processes. With applications to statistics
- Universal approximation bounds for superpositions of a sigmoidal function
- Dimension-independent bounds on the degree of approximation by neural networks
- The inverse of the star-discrepancy depends linearly on the dimension
- Bounds on rates of variable-basis and neural-network approximation
- Comparison of worst case errors in linear and neural network approximation
- On tractability of weighted integration over bounded and unbounded regions in ℝ^s
- Convergence of stochastic processes
- Intractability results for integration and discrepancy