On the tractability of multivariate integration and approximation by neural networks
Publication:876822
DOI: 10.1016/J.JCO.2003.11.004 | zbMATH Open: 1344.65036 | OpenAlex: W2077597269 | MaRDI QID: Q876822 | FDO: Q876822
Publication date: 30 April 2007
Published in: Journal of Complexity
Full work available at URL: https://doi.org/10.1016/j.jco.2003.11.004
Recommendations
- Neural-network approximation of functions of several variables
- Approximations by multivariate perturbed neural network operators
- Neural network modeling of vector multivariable functions in ill-posed approximation problems
- The errors of simultaneous approximation of multivariate functions by neural networks
- An Integral Upper Bound for Neural Network Approximation
- Degree of simultaneous approximation of multivariate functions and their derivatives by neural networks
- Neural networks and the best trigonometric approximation
- Neural networks for optimal approximation of continuous functions in \(\mathbb R^d\)
Cites Work
- Weak convergence and empirical processes. With applications to statistics
- Universal approximation bounds for superpositions of a sigmoidal function
- Comparison of worst case errors in linear and neural network approximation
- Title not available
- Convergence of stochastic processes
- Title not available
- Random approximants and neural networks
- Title not available
- Intractability results for integration and discrepancy
- Uniform approximation by neural networks
- The inverse of the star-discrepancy depends linearly on the dimension
- Bounds on rates of variable-basis and neural-network approximation
- When are integration and discrepancy tractable?
- On tractability of weighted integration over bounded and unbounded regions in \(\mathbb R^s\)
- Title not available
- Dimension-independent bounds on the degree of approximation by neural networks
- Title not available
Cited In (28)
- A low discrepancy sequence on graphs
- Correlations of random classifiers on large data sets
- Function approximation with zonal function networks with activation functions analogous to the rectified linear unit functions
- A tribute to Géza Freud
- The Barron space and the flow-induced function spaces for neural network models
- Negative results for approximation using single layer and multilayer feedforward neural networks
- Super-resolution meets machine learning: approximation of measures
- Suboptimal solutions to dynamic optimization problems via approximations of the policy functions
- Provable approximation properties for deep neural networks
- Almost optimal estimates for approximation and learning by radial basis function networks
- Estimates of variation with respect to a set and applications to optimization problems
- Accuracy of approximations of solutions to Fredholm equations by kernel methods
- Complexity of Gaussian-radial-basis networks approximating smooth functions
- On energy, discrepancy and group invariant measures on measurable subsets of Euclidean space
- Bracketing numbers for axis-parallel boxes and applications to geometric discrepancy
- Weighted variation spaces and approximation by shallow ReLU networks
- Approximation by neural networks with weights varying on a finite set of directions
- Construct theory of hierarchical functional networks and its application in multiple numerical integral
- Bounds and constructions for the star-discrepancy via \(\delta\)-covers
- A deep network construction that adapts to intrinsic dimensionality beyond the domain
- Two fast and accurate heuristic RBF learning rules for data classification
- Complexity estimates based on integral transforms induced by computational units
- Entropy, Randomization, Derandomization, and Discrepancy
- Weighted quadrature formulas and approximation by zonal function networks on the sphere
- Improved bounds for the bracketing number of orthants or revisiting an algorithm of Thiémard to compute bounds for the star discrepancy
- Advanced Monte Carlo Methods to Neural Networks
- On the problem of parameter estimation in exponential sums
- Approximation of Sobolev classes by polynomials and ridge functions