Neural network approximation of continuous functions in high dimensions with applications to inverse problems
From MaRDI portal
Publication: 6056231
DOI: 10.1016/j.cam.2023.115557
zbMath: 1527.41002
arXiv: 2208.13305
OpenAlex: W4386865483
MaRDI QID: Q6056231
Mark A. Iwen, Santhosh Karnik, Rongrong Wang
Publication date: 30 October 2023
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://arxiv.org/abs/2208.13305
MSC classifications: Artificial neural networks and deep learning (68T07); Approximation by other special function classes (41A30)
Cites Work
- New bounds for circulant Johnson-Lindenstrauss embeddings
- Random projections of smooth manifolds
- Universal approximations of invariant maps by neural networks
- Universality of deep convolutional neural networks
- Blind Deconvolution Using Convex Programming
- On variants of the Johnson–Lindenstrauss lemma
- Deep Convolutional Neural Network for Inverse Problems in Imaging
- High-Dimensional Probability
- Equivalence of approximation by convolutional neural networks and fully-connected networks
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- On the extension of Lipschitz, Lipschitz-Hölder continuous, and monotone functions
- Perturbation bounds in connection with singular value decomposition
- Nonparametric regression on low-dimensional manifolds using deep ReLU networks: function approximation and statistical recovery