A deep network construction that adapts to intrinsic dimensionality beyond the domain
Publication: 6054952
DOI: 10.1016/j.neunet.2021.06.004
arXiv: 2008.02545
OpenAlex: W3171226399
MaRDI QID: Q6054952
Alexander Cloninger, Timo Klock
Publication date: 28 September 2023
Published in: Neural Networks
Full work available at URL: https://arxiv.org/abs/2008.02545
Keywords: approximation theory; curse of dimensionality; composite functions; deep neural networks; noisy manifold models
Related Items (7)
- Shared subspace-based radial basis function neural network for identifying ncRNAs subcellular localization
- Drift estimation for a multi-dimensional diffusion process using deep neural networks
- Deep nonparametric estimation of intrinsic data structures by chart autoencoders: generalization error and robustness
- Side effects of learning from low-dimensional data embedded in a Euclidean space
- Neural network approximation and estimation of classifiers with classification boundary in a Barron class
- Stable recovery of entangled weights: towards robust identification of deep neural networks from minimal samples
- Deep neural networks can stably solve high-dimensional, noisy, non-linear inverse problems
Cites Work
- Hölder-Lipschitz norms and their duals on spaces with semigroups, with applications to earth mover's distance
- On the tractability of multivariate integration and approximation by neural networks
- Learning and approximation by Gaussians on Riemannian manifolds
- Random projections of smooth manifolds
- Approximation and estimation bounds for artificial neural networks
- Multilayer feedforward networks are universal approximators
- Provable approximation properties for deep neural networks
- Optimal nonlinear approximation
- Optimal global rates of convergence for nonparametric regression
- Approximation properties of a multilayered feedforward artificial neural network
- Limitations of the approximation capabilities of neural networks with one hidden layer
- Theory of deep convolutional neural networks. II: Spherical analysis
- A direct approach for function approximation on data defined manifolds
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Nonlinear approximation via compositions
- Theory of deep convolutional neural networks: downsampling
- Function approximation by deep networks
- Nonparametric regression using deep neural networks with ReLU activation function
- Error bounds for approximations with deep ReLU networks
- Universality of deep convolutional neural networks
- On deep learning as a remedy for the curse of dimensionality in nonparametric regression
- The reach, metric distortion, geodesic convexity and the variation of tangent spaces
- Manifold reconstruction using tangential Delaunay complexes
- Finding the homology of submanifolds with high confidence from random samples
- Deep vs. shallow networks: An approximation theory perspective
- Curvature Measures
- Universal approximation bounds for superpositions of a sigmoidal function
- Neural Networks for Localized Approximation
- Bounds on rates of variable-basis and neural-network approximation
- Comparison of worst case errors in linear and neural network approximation
- High-Dimensional Probability
- A Review on Dimension Reduction
- Optimal Approximation with Sparsely Connected Deep Neural Networks
- Deep neural networks for rotation-invariance approximation and learning
- Minimax Manifold Estimation
- Approximation by superpositions of a sigmoidal function