A Quantitative Functional Central Limit Theorem for Shallow Neural Networks

From MaRDI portal
Publication:6441986

Abstract: We prove a Quantitative Functional Central Limit Theorem for one-hidden-layer neural networks with a generic activation function. The rates of convergence that we establish depend heavily on the smoothness of the activation function, ranging from logarithmic for non-differentiable activations such as the ReLU to $\sqrt{n}$ for very regular ones. Our main tools are functional versions of the Stein-Malliavin approach; in particular, we rely heavily on a quantitative functional central limit theorem recently established by Bourguin and Campese (2020).
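
For orientation, a minimal sketch of the setting in LaTeX (the parametrization, scaling, and symbols below are illustrative assumptions, not quoted from the paper): a width-$n$ shallow network with i.i.d. random weights and $1/\sqrt{n}$ output scaling, with biases omitted for brevity.

% Sketch under an assumed i.i.d.-weight, 1/sqrt(n)-scaled parametrization:
\[
  f_n(x) \;=\; \frac{1}{\sqrt{n}} \sum_{j=1}^{n} v_j \,\sigma\bigl(\langle w_j, x\rangle\bigr).
\]
% As n grows, f_n converges as a random function to a centered Gaussian
% process G with covariance kernel
\[
  K(x,y) \;=\; \mathbb{E}\bigl[v^2\bigr]\,
  \mathbb{E}\bigl[\sigma(\langle w, x\rangle)\,\sigma(\langle w, y\rangle)\bigr].
\]
% A quantitative functional CLT goes further: it bounds a distance d(f_n, G)
% between the laws of f_n and G by an explicit function of n, and in this
% paper the resulting rate improves with the smoothness of the activation.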

This page was built for publication: A Quantitative Functional Central Limit Theorem for Shallow Neural Networks
