Associative memory by recurrent neural networks with delay elements

From MaRDI portal
Publication:1887124

DOI: 10.1016/S0893-6080(03)00207-7
zbMATH Open: 1082.68098
DBLP: journals/nn/MiyoshiYO04
arXiv: cond-mat/0209258
OpenAlex: W2077938939
Wikidata: Q51728842
Scholia: Q51728842
MaRDI QID: Q1887124
FDO: Q1887124


Authors: Seiji Miyoshi, Hiro-Fumi Yanai, Masato Okada


Publication date: 23 November 2004

Published in: Neural Networks

Abstract: The synapses of real neural systems appear to have delays, so it is worthwhile to analyze associative memory models with delayed synapses. We therefore discuss a sequential associative memory model with delayed synapses, in which a discrete synchronous updating rule and a correlation learning rule are employed, and analyze its dynamic properties by statistical neurodynamics. We first re-derive the Yanai-Kim theory, which gives macrodynamical equations for the dynamics of a network with serial delay elements. Since their theory requires a computational complexity of O(L^4 t) to obtain the macroscopic state at time step t, where L is the length of delay, it is intractable for discussing the macroscopic properties in the large-L limit. We therefore derive steady-state equations using the discrete Fourier transformation, whose computational complexity does not formally depend on L. We show that the storage capacity alpha_C is proportional to the delay length L in the large-L limit, with proportionality constant 0.195, i.e., alpha_C = 0.195L. These results are supported by computer simulations.
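The model described in the abstract can be sketched as a small toy simulation: a network of ±1 neurons stores a cyclic sequence of random patterns via a correlation (Hebbian) learning rule, with one synaptic matrix per delay line, and recalls the sequence under a discrete synchronous sign-update rule. This is a minimal illustration only, not the authors' statistical-neurodynamics analysis; the sizes N, L, P, the 10% corruption level, and the initialization scheme are assumptions chosen for a quick demo.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 400  # number of neurons (small, for a quick demo)
L = 3    # number of serial delay elements per synapse
P = 10   # length of the stored cyclic pattern sequence

# Random +-1 patterns forming a cyclic sequence xi[0] -> xi[1] -> ...
xi = rng.choice([-1, 1], size=(P, N))

# Correlation learning: the synaptic matrix with delay l associates the
# pattern seen l steps ago with the next pattern in the sequence.
J = np.zeros((L, N, N))
for l in range(L):
    for mu in range(P):
        J[l] += np.outer(xi[(mu + 1) % P], xi[(mu - l) % P])
J /= N

def corrupt(v, frac, rng):
    """Flip a fraction `frac` of the bits of the +-1 vector v."""
    flip = rng.random(v.size) < frac
    return np.where(flip, -v, v)

# Initialize the delay line with the first L patterns of the sequence,
# each corrupted by flipping 10% of the bits; history[-1-l] holds s(t-l).
history = [corrupt(xi[mu], 0.1, rng) for mu in range(L)]

# Discrete synchronous update: s(t+1) = sgn( sum_l J_l s(t-l) )
for t in range(L, L + 20):
    field = sum(J[l] @ history[-1 - l] for l in range(L))
    s_new = np.sign(field)
    s_new[s_new == 0] = 1  # break ties deterministically
    history.append(s_new)
    overlap = s_new @ xi[t % P] / N
    print(f"t={t}: overlap with xi[{t % P}] = {overlap:.3f}")
```

At this low loading (P/N = 0.025, far below capacity), the overlap with the correct pattern in the cycle quickly approaches 1, i.e., the corrupted initial states are cleaned up and the stored sequence is recalled in order.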


Full work available at URL: https://arxiv.org/abs/cond-mat/0209258










Cited In (10)





This page was built for publication: Associative memory by recurrent neural networks with delay elements
