Analysis of the rate of convergence of fully connected deep neural network regression estimates with smooth activation function (Q2222227)

From MaRDI portal





scientific article

      Statements

      Analysis of the rate of convergence of fully connected deep neural network regression estimates with smooth activation function (English)
      Publication date: 26 January 2021
      This paper investigates regression estimates based on deep neural networks (DNNs). In a previous article, [the author and \textit{M. Kohler}, ``On the rate of convergence of fully connected deep neural network regression estimates'', Preprint, \url{arXiv:1908.11133}], neural networks with the rectified linear unit (ReLU) activation function were considered. The question here is whether the same rate of convergence can be achieved by fully connected deep neural network regression estimates with a smooth activation function, the sigmoid. Indeed, the main result of the present paper shows that, under a set of sufficient conditions, the \(L_2\) errors of least squares neural network regression estimates based on a class of fully connected DNNs with a fixed number of layers achieve a rate of convergence similar to that of the article mentioned above.
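      For orientation, the estimate discussed above admits the standard least squares formulation used in this literature; the notation below (an i.i.d. sample \((X_1, Y_1), \dots, (X_n, Y_n)\), regression function \(m(x) = \mathbf{E}[Y \mid X = x]\), network class \(\mathcal{F}_n\)) is the conventional one and is assumed here rather than quoted from the paper:
      \[
      \hat{m}_n = \arg\min_{f \in \mathcal{F}_n} \frac{1}{n} \sum_{i=1}^{n} \left| f(X_i) - Y_i \right|^2,
      \qquad
      \sigma(x) = \frac{1}{1 + e^{-x}},
      \]
      where \(\mathcal{F}_n\) is a class of fully connected feedforward networks of fixed depth with sigmoid activation \(\sigma\). The rate of convergence then refers to the speed at which the \(L_2\) error
      \[
      \int \left| \hat{m}_n(x) - m(x) \right|^2 \, \mathbf{P}_X(dx)
      \]
      tends to zero as the sample size \(n\) grows.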
      curse of dimensionality
      deep learning
      neural networks
      nonparametric regression
      rate of convergence
