Tractability of the approximation of high-dimensional rank one tensors
From MaRDI portal
Publication:5962914
Abstract: We study the approximation of high-dimensional rank one tensors using point evaluations and consider deterministic as well as randomized algorithms. We prove that for certain parameters (smoothness and the norm of a certain derivative) this problem is intractable, while for other parameters the problem is tractable and the complexity is only polynomial in the dimension for every fixed error tolerance. For randomized algorithms we completely characterize the set of parameters that lead to easy or difficult problems, respectively. In the "difficult" case we modify the class to obtain a tractable problem: the problem becomes tractable, with complexity polynomial in the dimension, if the support of the function is not too small.
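The key structural fact behind such results is that a rank one tensor \(f(x)=\prod_j f_j(x_j)\) is determined by its values along the \(d\) coordinate lines through any point where it does not vanish. The following is a minimal illustrative sketch of this idea, not the algorithm of the paper; all function names and the choice of grid are hypothetical.

```python
import numpy as np

def recover_factors(f, x_star, grid):
    """Sketch: given a point x_star with f(x_star) != 0, recover each
    univariate factor (up to constants) on `grid` using only
    d * len(grid) point evaluations of f along coordinate lines."""
    d = len(x_star)
    base = f(x_star)  # assumed nonzero
    factors = []
    for j in range(d):
        vals = []
        for t in grid:
            x = x_star.copy()
            x[j] = t  # move only along the j-th coordinate line
            vals.append(f(x))
        # Along this line, f(x) = f(x_star) * f_j(t) / f_j(x_star[j]),
        # so dividing by base isolates the j-th factor up to a constant.
        factors.append(np.array(vals) / base)
    return base, factors  # f(x) = base * prod_j factors[j](x_j)

# Toy example: a d = 3 rank one tensor with smooth, nonvanishing factors
d = 3
fs = [lambda t: 1 + 0.5 * np.sin(2 * np.pi * t) for _ in range(d)]
f = lambda x: np.prod([fs[j](x[j]) for j in range(d)])

x_star = np.full(d, 0.25)          # a point where f is nonzero
grid = np.linspace(0.0, 1.0, 5)
base, factors = recover_factors(f, x_star, grid)

# Reconstruct f at a test point whose coordinates lie on the grid
x_test = np.array([0.0, 0.5, 1.0])
approx = base * np.prod([np.interp(x_test[j], grid, factors[j])
                         for j in range(d)])
print(abs(approx - f(x_test)))
```

The hard part, which the tractability analysis addresses, is locating a point where \(f\) is not (nearly) zero in the first place: if the support of \(f\) is very small, no deterministic point set of polynomial size can be guaranteed to hit it, which is where the dispersion of point sequences enters.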
Recommendations
- Approximation of high-dimensional rank one tensors
- Recovery algorithms for high-dimensional rank one tensors
- On tractability of approximation in special function spaces
- Generalized tractability for multivariate problems. I: Linear tensor product problems and linear information
- Generalized tractability for multivariate problems. II: Linear tensor product problems, linear information, and unrestricted tractability
Cites work
- Approximation of high-dimensional rank one tensors
- Approximation of infinitely differentiable multivariate functions is intractable
- Deterministic and stochastic error bounds in numerical analysis
- Entropy and sampling numbers of classes of ridge functions
- Learnability and the Vapnik-Chervonenkis dimension
- On the largest empty axis-parallel box amidst \(n\) points
- On the power of adaption
- Quasi-Monte-Carlo methods and the dispersion of point sequences
- The curse of dimensionality for numerical integration of smooth functions
- Tractability of multivariate problems. Volume I: Linear information
- Tractability of multivariate problems. Volume II: Standard information for functionals
- Tractability of multivariate problems. Volume III: Standard information for operators
Cited in (18 documents)
- Recovery algorithms for high-dimensional rank one tensors
- An upper bound of the minimal dispersion via delta covers
- The minimal \(k\)-dispersion of point sets in high dimensions
- Tractability of sampling recovery on unweighted function classes
- On the size of the largest empty box amidst a point set
- On the dispersion of sparse grids
- A note on minimal dispersion of point sets in the unit cube
- On optimal approximation in periodic Besov spaces
- An upper bound on the minimal dispersion
- \(O(d \log N)\)-quantics approximation of \(N\)-\(d\) tensors in high-dimensional numerical modeling
- Transformed rank-1 lattices for high-dimensional approximation
- Tensor rank is hard to approximate
- Minimal dispersion of large volume boxes in the cube
- Approximation of high-dimensional rank one tensors
- Computing f‐divergences and distances of high‐dimensional probability density functions
- Deterministic constructions of high-dimensional sets with small dispersion
- Expected dispersion of uniformly distributed points
- The recovery of ridge functions on the hypercube suffers from the curse of dimensionality