Data-driven learning of feedforward neural networks with different activation functions
Publication: 2148705
DOI: 10.1007/978-3-030-87986-0_6
zbMATH Open: 1497.68417
arXiv: 2107.01702
OpenAlex: W3207082665
MaRDI QID: Q2148705
Publication date: 24 June 2022
Abstract: This work contributes to the development of a new data-driven method (D-DM) for learning feedforward neural networks (FNNs). The method was proposed recently as a way of improving the randomized learning of FNNs by adjusting the network parameters to the fluctuations of the target function. It employs logistic sigmoid activation functions for the hidden nodes. In this study, we introduce other activation functions, such as the bipolar sigmoid, the sine function, saturating linear functions, ReLU, and softplus, and derive formulas for their parameters, i.e. weights and biases. In a simulation study, we evaluate the performance of data-driven FNN learning with the different activation functions. The results indicate that the sigmoid activation functions perform much better than the others in the approximation of complex, fluctuating target functions.
Full work available at URL: https://arxiv.org/abs/2107.01702
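As a rough illustration of the setting, the sketch below fits a single-hidden-layer FNN with random hidden parameters and least-squares output weights, trying several of the activation functions compared in the paper. This is generic randomized FNN learning, not the authors' D-DM (whose weight and bias formulas are derived from the target function's fluctuations); all function names, parameter ranges, and the test target here are illustrative assumptions.

```python
import numpy as np

# Activation functions compared in the paper (the names below are mine,
# not the authors' identifiers):
def logistic(x):  return 1.0 / (1.0 + np.exp(-x))        # logistic sigmoid
def bipolar(x):   return 2.0 / (1.0 + np.exp(-x)) - 1.0  # bipolar sigmoid, i.e. tanh(x/2)
def satlin(x):    return np.clip(x, 0.0, 1.0)            # saturating linear
def relu(x):      return np.maximum(0.0, x)
def softplus(x):  return np.log1p(np.exp(x))

def fit_random_fnn(x, y, m=100, phi=logistic, seed=0):
    """Fit y ~ sum_i beta_i * phi(a_i * x + b_i) with random hidden weights
    a_i and biases b_i and least-squares output weights beta.
    The [-10, 10] range for the random hidden parameters is an assumption."""
    rng = np.random.default_rng(seed)
    a = rng.uniform(-10.0, 10.0, m)           # random hidden weights
    b = rng.uniform(-10.0, 10.0, m)           # random hidden biases
    H = phi(np.outer(x, a) + b)               # hidden-layer output matrix (n x m)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return lambda t: phi(np.outer(t, a) + b) @ beta

# Toy experiment: approximate a fluctuating 1-D target with each activation.
x = np.linspace(0.0, 1.0, 200)
y = np.sin(20.0 * np.pi * x) * np.exp(-2.0 * x)           # hypothetical target
for name, phi in [("logistic", logistic), ("bipolar", bipolar),
                  ("sine", np.sin), ("satlin", satlin),
                  ("ReLU", relu), ("softplus", softplus)]:
    model = fit_random_fnn(x, y, phi=phi)
    rmse = np.sqrt(np.mean((model(x) - y) ** 2))
    print(f"{name:>8s}: RMSE = {rmse:.4f}")
```

In this purely random setup the hidden parameters ignore the target's shape; the paper's D-DM instead sets them from the data, which is why its comparison across activation functions is informative.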
Recommendations
- Deep neural networks with a set of node-wise varying activation functions
- Diffusion learning algorithms for feedforward neural networks
- Neural learning methods yielding functional invariance
- Data processing and feature screening in function approximation: An application to neural networks