Binary Willshaw learning yields high synaptic capacity for long-term familiarity memory
From MaRDI portal
Publication:420950
DOI: 10.1007/s00422-012-0488-4
zbMATH Open: 1237.92019
DBLP: journals/bc/SacramentoW12
arXiv: 1207.7196
OpenAlex: W3102277129
Wikidata: Q47997657 (Scholia: Q47997657)
MaRDI QID: Q420950 (FDO: Q420950)
Authors: João Sacramento, Andreas Wichert
Publication date: 23 May 2012
Published in: Biological Cybernetics
Abstract: We investigate, from a computational perspective, the efficiency of the Willshaw synaptic update rule in the context of familiarity discrimination, a binary-answer, memory-related task that psychophysical experiments have linked to modified neural activity patterns in the prefrontal and perirhinal cortex. Our motivation for revisiting this well-known learning prescription is two-fold: first, the switch-like nature of the induced synaptic bonds, since there is evidence that biological synaptic transitions may occur in a discrete, stepwise fashion; second, the possibility that in the mammalian brain unused, silent synapses are pruned in the long term. Besides the usual pattern and network capacities, we calculate the synaptic capacity of the model, a recently proposed measure in which only the functional subset of synapses is taken into account. We find that in terms of network capacity, Willshaw learning is strongly affected by the pattern coding rates, which must be kept fixed and very low at all times to achieve a non-zero capacity in the large-network limit. The information carried per functional synapse, however, diverges and is comparable to that of the pattern-association case, even for more realistic, moderately low activity levels that are a function of network size.
Full work available at URL: https://arxiv.org/abs/1207.7196
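The binary Willshaw rule and the energy-threshold familiarity test described in the abstract can be sketched as follows. This is an illustrative reconstruction, not code from the paper; the network size, activity level, and pattern count are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 1000, 20, 200  # neurons, active units per pattern, stored patterns

def sparse_pattern():
    """A random binary pattern with exactly k of n units active."""
    x = np.zeros(n, dtype=bool)
    x[rng.choice(n, size=k, replace=False)] = True
    return x

stored = [sparse_pattern() for _ in range(m)]

# Binary Willshaw learning: a synapse switches on, once and for all,
# whenever its pre- and post-synaptic units are coactive in some
# stored pattern. The matrix stays strictly binary.
W = np.zeros((n, n), dtype=bool)
for x in stored:
    W |= np.outer(x, x)

def is_familiar(x):
    # For a stored pattern every pair of active units has a set synapse,
    # so the quadratic "energy" x^T W x attains its maximum k*k exactly;
    # a novel pattern almost surely probes at least one silent synapse.
    active = np.flatnonzero(x)
    return int(W[np.ix_(active, active)].sum()) >= k * k
```

By construction every stored pattern is accepted, while a fresh random pattern is rejected with overwhelming probability as long as the matrix is far from saturation; the expected fraction of switched-on (functional) synapses is roughly 1 - (1 - (k/n)^2)^m, which stays small for sparse coding and motivates counting only that functional subset when measuring synaptic capacity.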
Cites Work
- A Mathematical Theory of Communication
- Neural networks and physical systems with emergent collective computational abilities
- On associative memory
- Improving recall from an associative memory
- Information capacity in recurrent McCulloch–Pitts networks with sparsely coded memory states
- Optimising synaptic learning rules in linear associative memories
- Optimal learning rules for familiarity detection
- Neural associative memory with optimal Bayesian learning
- Die Lernmatrix
- Neural Associative Memory and the Willshaw–Palm Probability Distribution
- Information storage in sparsely coded memory nets
- Storage capacity of neural networks: effect of the fluctuations of the number of active neurons per memory
- Memory Capacities for Synaptic and Structural Plasticity
- Dynamics and robustness of familiarity memory
Cited In (3)