Learning ability of interpolating deep convolutional neural networks
From MaRDI portal
Publication: 6185680
DOI: 10.1016/j.acha.2023.101582
arXiv: 2210.14184
MaRDI QID: Q6185680
Publication date: 30 January 2024
Published in: Applied and Computational Harmonic Analysis
Full work available at URL: https://arxiv.org/abs/2210.14184
Cites Work
- Entropy and the combinatorial dimension
- Linear processes in function spaces. Theory and applications
- A distribution-free theory of nonparametric regression
- Approximation properties of a multilayered feedforward artificial neural network
- Approximation spaces of deep neural networks
- Nonparametric regression using deep neural networks with ReLU activation function
- Error bounds for approximations with deep ReLU networks
- Universality of deep convolutional neural networks
- Ten Lectures on Wavelets
- Approximation by combinations of ReLU and squared ReLU ridge functions with $\ell^1$ and $\ell^0$ controls
- Neural Network Learning
- Optimal Approximation with Sparsely Connected Deep Neural Networks
- Benign overfitting in linear regression
- Universal Consistency of Deep Convolutional Neural Networks
- Equivalence of approximation by convolutional neural networks and fully-connected networks
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- Theory of deep convolutional neural networks. III: Approximating radial functions
- Neural network approximation and estimation of classifiers with classification boundary in a Barron class