A Quantitative Functional Central Limit Theorem for Shallow Neural Networks
Abstract: We prove a Quantitative Functional Central Limit Theorem for one-hidden-layer neural networks with generic activation function. The rates of convergence that we establish depend heavily on the smoothness of the activation function, ranging from logarithmic in non-differentiable cases such as the ReLU to \(\sqrt{n}\) for very regular activations. Our main tools are functional versions of the Stein-Malliavin approach; in particular, we make heavy use of a quantitative functional central limit theorem recently established by Bourguin and Campese (2020).
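As a rough illustration of the setting (a sketch only; the paper's precise normalization and assumptions are not reproduced here), the object of study is a one-hidden-layer network with \(n\) hidden units and random Gaussian weights, of the form
\[
f_n(x) \;=\; \frac{1}{\sqrt{n}} \sum_{j=1}^{n} v_j\, \sigma\!\left(\langle w_j, x\rangle\right),
\]
and a quantitative functional CLT bounds, in a suitable functional distance, how far the random field \(f_n\) is from its Gaussian process limit as \(n\to\infty\), with a rate governed by the smoothness of the activation \(\sigma\).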
Recommendations
- Mean field analysis of neural networks: a central limit theorem
- Rates of approximation by ReLU shallow neural networks
- High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions
- Analysis of the rate of convergence of fully connected deep neural network regression estimates with smooth activation function
Cited in (1)