Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation (Q2068413)


Language: English
Label: Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation
Description: scientific article

    Statements

    Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation (English)
    19 January 2022
    neural networks
    machine learning
    statistical physics

    Identifiers

    OpenAlex ID: W2981298641