Associative memory by recurrent neural networks with delay elements
From MaRDI portal
Publication:1887124
DOI: 10.1016/S0893-6080(03)00207-7
zbMATH Open: 1082.68098
DBLP: journals/nn/MiyoshiYO04
arXiv: cond-mat/0209258
OpenAlex: W2077938939
Wikidata: Q51728842 (Scholia: Q51728842)
MaRDI QID: Q1887124
FDO: Q1887124
Authors: Seiji Miyoshi, Hiro-Fumi Yanai, Masato Okada
Publication date: 23 November 2004
Published in: Neural Networks
Abstract: The synapses of real neural systems appear to have delays, so it is worthwhile to analyze associative memory models with delayed synapses. Accordingly, a sequential associative memory model with delayed synapses is discussed, employing a discrete synchronous updating rule and a correlation learning rule. Its dynamic properties are analyzed by statistical neurodynamics. In this paper, we first re-derive the Yanai-Kim theory, which gives macrodynamical equations for the dynamics of a network with serial delay elements. Since the computational cost of obtaining the macroscopic state at time step t by their theory grows steeply with the delay length L, it is intractable to discuss the macroscopic properties in the large-L limit. We therefore derive steady-state equations using the discrete Fourier transformation, for which the computational complexity does not formally depend on L. We show that in the large-L limit the storage capacity grows in proportion to the delay length L with proportionality constant 0.195, i.e., the storage capacity is approximately 0.195L. These results are supported by computer simulations.
Full work available at URL: https://arxiv.org/abs/cond-mat/0209258
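The model described in the abstract — a cyclic sequence of ±1 patterns stored by a correlation learning rule across L serial delay elements, recalled by a discrete synchronous update — can be sketched as follows. This is a minimal illustration only, not the paper's exact formulation or analysis; the network size, sequence length, and indexing convention are assumptions chosen for the demo, and the loading here is far below the 0.195L capacity limit, so recall should be stable.

```python
import numpy as np

rng = np.random.default_rng(0)
N, L, p = 200, 2, 3            # neurons, delay length, sequence length (assumed values)

# Random +/-1 patterns forming a cyclic sequence xi[0] -> xi[1] -> ... -> xi[0]
xi = rng.choice([-1, 1], size=(p, N))

# Correlation learning: the synapse with delay l maps the pattern presented
# l steps in the past onto the successor of the current pattern.
J = np.zeros((L, N, N))
for l in range(L):
    for mu in range(p):
        J[l] += np.outer(xi[(mu + 1) % p], xi[(mu - l) % p])
J /= N

# History buffer: states at times t, t-1, ..., t-L+1 (newest first),
# initialised on the stored sequence itself.
hist = [xi[(-l) % p].copy() for l in range(L)]

overlaps = []
for t in range(1, 10):
    h = sum(J[l] @ hist[l] for l in range(L))   # total delayed synaptic input
    s = np.sign(h)                              # discrete synchronous update
    overlaps.append(float(s @ xi[t % p]) / N)   # overlap with the expected pattern
    hist = [s] + hist[:-1]

print(overlaps)  # overlaps should stay close to 1.0 (successful sequential recall)
```

Because the number of stored patterns p is tiny compared with N·L, the crosstalk noise is negligible and the network steps through the sequence with near-perfect overlap at every time step.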
Recommendations
- A simple delayed neural network with large capacity for associative memory
- Delay for the capacity-simplicity dilemma in associative memory attractor networks
- Statistical mechanics of temporal association in neural networks with transmission delays
- Networks of delay differential equations and associative memories
Cites Work
- Neural networks and physical systems with emergent collective computational abilities
- Notions of associative memory and sparse coding
- Self-consistent signal-to-noise analysis and its application to analogue neural networks with asymmetric connections
- Transient dynamics for sequence processing neural networks
- Phase diagram and storage capacity of sequence processing neural networks
Cited In (10)
- Modelling memory functions with recurrent neural networks consisting of input compensation units: I. Static situations
- Analysis and Design of Associative Memories Based on Recurrent Neural Networks with Linear Saturation Activation Functions and Time-Varying Delays
- Elements for a general memory structure: properties of recurrent neural networks used to form situation models
- Delay for the capacity-simplicity dilemma in associative memory attractor networks
- Memory in linear recurrent neural networks in continuous time
- Theory of Recurrent Neural Network with Common Synaptic Inputs
- Delay-probability-distribution-dependent stability criteria for discrete-time stochastic neural networks with random delays
- Time for retrieval in recurrent associative memories
- A simple delayed neural network with large capacity for associative memory