A Quantitative Functional Central Limit Theorem for Shallow Neural Networks


DOI: 10.15559/23-VMSTA238
arXiv: 2306.16932
OpenAlex: W4389094973
MaRDI QID: Q6441986
FDO: Q6441986


Authors: Valentina Cammarota, Domenico Marinucci, Michele Salvi, Stefano Vigogna


Publication date: 29 June 2023

Abstract: We prove a Quantitative Functional Central Limit Theorem for one-hidden-layer neural networks with generic activation function. The rates of convergence that we establish depend heavily on the smoothness of the activation function, and they range from logarithmic in non-differentiable cases such as the ReLU to √n for very regular activations. Our main tools are functional versions of the Stein-Malliavin approach; in particular, we rely heavily on a quantitative functional central limit theorem recently established by Bourguin and Campese (2020).
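
As a minimal illustrative sketch (not part of the MaRDI record), the standard setting behind such results is a randomly initialized one-hidden-layer network; the parametrization below and the symbols f_n, σ, v_i, w_i, G, K are assumptions for illustration, not quotations from the paper:

\[
  f_n(x) = \frac{1}{\sqrt{n}} \sum_{i=1}^{n} v_i\, \sigma(\langle w_i, x \rangle),
  \qquad (v_i, w_i) \ \text{i.i.d.},\ \mathbb{E}[v_i] = 0.
\]

As the width \(n\) grows, \(f_n\) converges to a centered Gaussian process \(G\) with covariance kernel \(K(x,y) = \mathbb{E}[v^2]\,\mathbb{E}[\sigma(\langle w, x\rangle)\,\sigma(\langle w, y\rangle)]\). A quantitative functional CLT of the kind described in the abstract bounds a suitable distance between the law of \(f_n\) and that of \(G\), with the rate (from logarithmic up to √n, per the abstract) governed by the smoothness of σ.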


Full work available at URL: https://doi.org/10.15559/23-vmsta238




Cited In (1)





