Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation (Q2068413)

From MaRDI portal
Language: English
Label: Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation
Description: scientific article

    Statements

    Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation (English)
    19 January 2022
    neural networks
    machine learning
    statistical physics

    Identifiers