Statistical guarantees for regularized neural networks

From MaRDI portal

DOI: 10.1016/J.NEUNET.2021.04.034
zbMATH Open: 1521.68202
arXiv: 2006.00294
OpenAlex: W3159120966
MaRDI QID: Q6079063


Authors: M. Taheri, Fang Xie, Johannes Lederer


Publication date: 28 September 2023

Published in: Neural Networks

Abstract: Neural networks have become standard tools in the analysis of data, but they lack comprehensive mathematical theories. For example, there are very few statistical guarantees for learning neural networks from data, especially for the classes of estimators that are used in practice or at least similar to them. In this paper, we develop a general statistical guarantee for estimators that consist of a least-squares term and a regularizer. We then exemplify this guarantee with ℓ1-regularization, showing that the corresponding prediction error increases at most sub-linearly in the number of layers and at most logarithmically in the total number of parameters. Our results establish a mathematical basis for regularized estimation of neural networks, and they deepen our mathematical understanding of neural networks and deep learning more generally.
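The estimator class described in the abstract combines a least-squares fitting term with a regularizer, exemplified there by an ℓ1 penalty on the network weights. A minimal sketch of such an objective is below, assuming a two-layer ReLU network for concreteness; the function names and the specific architecture are illustrative and not taken from the paper.

```python
import numpy as np

def relu(x):
    """Elementwise ReLU activation."""
    return np.maximum(x, 0.0)

def forward(params, X):
    """Two-layer ReLU network: f(x) = W2 @ relu(W1 @ x)."""
    W1, W2 = params
    return relu(X @ W1.T) @ W2.T

def regularized_objective(params, X, y, lam):
    """Least-squares term plus an l1 penalty on all weights,
    i.e., the general 'least-squares + regularizer' form of estimator
    that the paper's guarantee covers."""
    preds = forward(params, X).ravel()
    least_squares = np.mean((y - preds) ** 2)
    l1_penalty = sum(np.abs(W).sum() for W in params)
    return least_squares + lam * l1_penalty
```

Setting `lam = 0` recovers the plain least-squares objective; increasing `lam` trades fit for weight sparsity, which is the mechanism behind the logarithmic dependence on the total number of parameters in the paper's bound.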


Full work available at URL: https://arxiv.org/abs/2006.00294



Cited in: 10 publications



