Hebbian learning reconsidered: Representation of static and dynamic objects in associative neural nets (Q1113827)

From MaRDI portal
Property / author: Andreas V. M. Herz
Property / author: Bernhard Sulzer
Property / author: Reimer Kühn
Property / author: J. Leo van Hemmen
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.1007/bf00204701
Property / OpenAlex ID: W2065821796


scientific article
Language: English
Label: Hebbian learning reconsidered: Representation of static and dynamic objects in associative neural nets

    Statements

    Hebbian learning reconsidered: Representation of static and dynamic objects in associative neural nets (English)
    1989
    According to Hebb's postulate for learning, information presented to a neural net during a learning session is stored in the synaptic efficacies. Long-term potentiation occurs only if the postsynaptic neuron becomes active in a time window set up by the presynaptic one. We carefully interpret and mathematically implement the Hebb rule so as to handle both stationary and dynamic objects, such as single patterns and cycles. Since the natural dynamics contains a rather broad distribution of delays, the key idea is to incorporate these delays into the learning session. As theory and numerical simulation show, the resulting procedure is surprisingly robust and faithful. It also turns out that pure Hebbian learning works by selection: the network produces synaptic representations that are selected according to their resonance with the input percepts.
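    The delayed Hebb rule sketched in the abstract can be illustrated with a minimal Hopfield-type simulation. Everything below is an illustrative assumption rather than the paper's exact model: the network size, the use of one discrete delay line per time lag, and the deterministic parallel dynamics are choices made only to show the principle that correlating postsynaptic activity with delayed presynaptic activity stores a cycle of patterns.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 100, 4        # neurons and cycle length (hypothetical sizes)
    D = P                # one synaptic delay line per time lag tau = 1..D

    # A cycle of random +/-1 patterns; xi[mu] should follow xi[mu-1].
    xi = rng.choice([-1, 1], size=(P, N))

    # Delayed Hebb rule: while the net is driven through the cycle, each
    # delay line tau correlates the current (postsynaptic) state with the
    # state tau steps earlier (presynaptic activity after delay tau).
    J = np.zeros((D, N, N))
    for tau in range(1, D + 1):
        for mu in range(P):
            J[tau - 1] += np.outer(xi[mu], xi[(mu - tau) % P]) / N

    def step(history):
        """Parallel dynamics: the local field sums all delayed inputs."""
        h = sum(J[tau - 1] @ history[-tau] for tau in range(1, D + 1))
        return np.sign(h)

    # Seed the delay lines with one pass of the cycle, then run freely.
    history = [xi[mu % P] for mu in range(D)]     # states at t = 0..D-1
    for t in range(D, D + 3 * P):
        history.append(step(history))

    # Overlaps of the free-running states with the stored cycle; values
    # near 1 mean the net keeps reproducing the sequence on its own.
    overlaps = [history[t] @ xi[t % P] / N for t in range(D, len(history))]
    ```

    With these sizes the cross-talk noise is small compared with the signal (which grows with the number of delay lines), so the overlaps stay close to 1 and the net cycles stably through the stored sequence.
    
    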
    neurophysiology
    neural net
    Hebb rule
    Hebbian learning
    synaptic representations