Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation (Q2068413): Difference between revisions

From MaRDI portal
Changed an Item
Property / describes a project that uses: PRMLT / rank: Normal rank

Revision as of 00:30, 1 March 2024

Language: English
Label: Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation
Description: scientific article

    Statements

    Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation (English)
    19 January 2022
    neural networks
    machine learning
    statistical physics