Hebbian learning reconsidered: Representation of static and dynamic objects in associative neural nets (Q1113827)
From MaRDI portal
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | Hebbian learning reconsidered: Representation of static and dynamic objects in associative neural nets | scientific article |
Statements
Hebbian learning reconsidered: Representation of static and dynamic objects in associative neural nets (English)
1989
According to Hebb's postulate for learning, information presented to a neural net during a learning session is stored in the synaptic efficacies. Long-term potentiation occurs only if the postsynaptic neuron becomes active in a time window set up by the presynaptic one. We carefully interpret and mathematically implement the Hebb rule so as to handle both stationary and dynamic objects, such as single patterns and cycles. Since the natural dynamics contains a rather broad distribution of delays, the key idea is to incorporate these delays in the learning session. As theory and numerical simulation show, the resulting procedure is surprisingly robust and faithful. It also turns out that pure Hebbian learning is learning by selection: the network produces synaptic representations that are selected according to their resonance with the input percepts.
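The idea of incorporating transmission delays into the learning session can be sketched as follows. This is a minimal illustration under assumed conventions (parallel ±1 dynamics, discrete delay lines tau = 1..D, a stored cycle of random patterns), not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

N, T, D = 100, 4, 3  # neurons, cycle length, number of delay lines (tau = 1..D)

# A cyclic sequence of random +/-1 patterns; xi[t] is the percept shown at time t.
xi = rng.choice([-1, 1], size=(T, N))

# Delay-resolved Hebb rule: a synapse with delay tau correlates postsynaptic
# activity at time t with presynaptic activity at time t - tau, so the delay
# distribution present in the dynamics also enters the learning session.
J = np.zeros((D, N, N))
for k in range(D):                 # k indexes the delay tau = k + 1
    for t in range(T):
        J[k] += np.outer(xi[t], xi[(t - (k + 1)) % T])
J /= N * D

def step(states):
    """Parallel update: the local field sums the contributions of all delay lines."""
    h = sum(J[k] @ states[-(k + 1)] for k in range(D))
    return np.where(h >= 0, 1, -1)

# Retrieval: seed the delay lines with the start of the cycle and let the net run.
states = [xi[t % T] for t in range(D)]
for _ in range(2 * T):
    states.append(step(states))
# The net keeps reproducing the stored cycle: states[t] matches xi[t % T].
```

Because each delay line stores the pattern correlation at its own lag, the same delayed dynamics that generated the training signal also drives retrieval, which is what makes cycles (and, with tau = 0 only, static patterns) fixed objects of the dynamics.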
neurophysiology
neural net
Hebb rule
Hebbian learning
synaptic representations