Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation (Q2068413)
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation | scientific article | |
Statements
title: Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation (English)
publication date: 19 January 2022
keyword: neural networks
keyword: machine learning
keyword: statistical physics