Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation (Q2068413)

From MaRDI portal
scientific article
Language: English
Label: Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation
Description: scientific article

    Statements

    Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation (English)
    19 January 2022
    neural networks
    machine learning
    statistical physics