A Quantitative Functional Central Limit Theorem for Shallow Neural Networks
Publication: 6441986
DOI: 10.15559/23-VMSTA238
arXiv: 2306.16932
OpenAlex: W4389094973
MaRDI QID: Q6441986
FDO: Q6441986
Authors: Valentina Cammarota, Domenico Marinucci, Michele Salvi, Stefano Vigogna
Publication date: 29 June 2023
Abstract: We prove a Quantitative Functional Central Limit Theorem for one-hidden-layer neural networks with generic activation function. The rates of convergence that we establish depend heavily on the smoothness of the activation function, ranging from logarithmic rates in non-differentiable cases such as the ReLU to much faster rates for very regular activations. Our main tools are functional versions of the Stein–Malliavin approach; in particular, we rely heavily on a quantitative functional central limit theorem recently established by Bourguin and Campese (2020).
Full work available at URL: https://doi.org/10.15559/23-vmsta238
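As an informal illustration of the object described in the abstract, the sketch below simulates a one-hidden-layer network with i.i.d. Gaussian weights and 1/sqrt(n) scaling and compares its empirical covariance at two fixed inputs with the covariance kernel of the Gaussian limit. The tanh activation, the parametrization, and all numerical choices here are assumptions made for illustration; they are not taken from the paper's construction.

```python
# Minimal simulation sketch (illustrative assumptions, not the paper's setup):
# a random shallow network f_n(x) = n^{-1/2} * sum_j v_j * sigma(w_j . x) with
# i.i.d. standard Gaussian weights, whose finite-dimensional distributions
# approach those of a centered Gaussian process as the width n grows.
import numpy as np

rng = np.random.default_rng(0)

def shallow_net(x, n, sigma=np.tanh):
    """Evaluate a width-n random shallow network at inputs x of shape (m, d)."""
    d = x.shape[1]
    w = rng.standard_normal((n, d))   # hidden-layer weights
    v = rng.standard_normal(n)        # output weights
    return (sigma(x @ w.T) @ v) / np.sqrt(n)

# Evaluate many independent networks at two fixed inputs and compare the
# empirical covariance with the limiting covariance K(x, x') = E[sigma(W.x) sigma(W.x')],
# estimated here by Monte Carlo over the hidden-weight distribution.
x = np.array([[1.0, 0.0], [0.5, 0.5]])
samples = np.array([shallow_net(x, n=2000) for _ in range(5000)])

w_mc = rng.standard_normal((100000, 2))
feats = np.tanh(w_mc @ x.T)
K = feats.T @ feats / len(w_mc)

print("empirical covariance:\n", np.cov(samples.T))
print("limit covariance K:\n", K)
```

Under the assumed 1/sqrt(n) scaling, the two printed matrices should nearly agree for large widths; the theorem in the paper quantifies how fast this Gaussian approximation holds at the level of the whole random field.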
Recommendations
- Mean field analysis of neural networks: a central limit theorem
- Rates of approximation by ReLU shallow neural networks
- High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions
- Analysis of the rate of convergence of fully connected deep neural network regression estimates with smooth activation function
MSC classification:
- Random fields (60G60)
- Artificial neural networks and deep learning (68T07)
- Functional limit theorems; invariance principles (60F17)
Cited In (1)