Retrieval dynamics of neural networks for sparsely coded sequential patterns
Publication: 4259795
DOI: 10.1088/0305-4470/31/36/004
zbMATH Open: 0925.82162
arXiv: cond-mat/9805135
OpenAlex: W3123904788
MaRDI QID: Q4259795
FDO: Q4259795
Authors: Katsunori Kitano, Toshio Aoyagi
Publication date: 21 November 1999
Published in: Journal of Physics A: Mathematical and General
Abstract: It is well known that a sparsely coded network, in which the activity level is extremely low, has intriguing equilibrium properties. In the present work, we study the dynamical properties of a neural network designed to store sparsely coded sequential patterns rather than static ones. Applying the theory of statistical neurodynamics, we derive the dynamical equations governing the retrieval process, described by macroscopic order parameters such as the overlap. It is found that our theory provides good predictions for the storage capacity and the basin of attraction obtained through numerical simulations. The results indicate that the nature of the basin of attraction depends on the method of activity control employed. Furthermore, it is found that robustness against random synaptic dilution deteriorates slightly with the degree of sparseness.
Full work available at URL: https://arxiv.org/abs/cond-mat/9805135
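As a concrete illustration of the setup described in the abstract, the following NumPy sketch simulates retrieval of sparsely coded sequential patterns with a covariance-type asymmetric Hebbian coupling and tracks the overlap order parameter. This is a minimal sketch, not the authors' implementation: the network size N, number of patterns P, activity level a, and the fixed firing threshold theta are illustrative assumptions, and the fixed threshold stands in for the activity-control schemes compared in the paper.

```python
# Minimal sketch (not the authors' code): retrieval of a sparsely coded
# pattern sequence with a covariance-type asymmetric Hebbian coupling.
# Assumed setup: 0/1 neurons, synchronous updates, and a fixed threshold
# as a crude stand-in for the activity-control schemes studied in the paper.
import numpy as np

rng = np.random.default_rng(0)

N, P, a = 2000, 20, 0.1   # neurons, stored patterns, mean activity (sparseness)
theta = 0.35              # firing threshold (illustrative value)

# Sparse binary patterns xi^mu with mean activity a, stored as a cyclic sequence.
xi = (rng.random((P, N)) < a).astype(float)

# Sequential coupling: J_ij = 1/(a(1-a)N) * sum_mu (xi_i^{mu+1} - a)(xi_j^mu - a)
J = (xi[(np.arange(P) + 1) % P] - a).T @ (xi - a) / (a * (1.0 - a) * N)
np.fill_diagonal(J, 0.0)  # no self-coupling

def overlap(s):
    """Overlap order parameters m^mu = 1/(a(1-a)N) * sum_i (xi_i^mu - a) s_i."""
    return (xi - a) @ s / (a * (1.0 - a) * N)

# Initialize near pattern 0 (a few bits flipped) and iterate the dynamics.
s = xi[0].copy()
flip = rng.random(N) < 0.05
s[flip] = 1.0 - s[flip]

for t in range(P):
    m = overlap(s)
    print(f"t={t:2d}  closest pattern={int(np.argmax(m))}  overlap={m.max():.2f}")
    s = (J @ s > theta).astype(float)  # synchronous threshold update
```

Run as a script, this prints the best-matching pattern at each time step; successful retrieval shows the overlap moving along the stored sequence 0 -> 1 -> 2 -> ... rather than staying at the initial pattern.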
Cited In (4)
- Stability and long term behavior of a Hebbian network of Kuramoto oscillators
- Stability in a Hebbian Network of Kuramoto Oscillators with Second-Order Couplings for Binary Pattern Retrieve
- Self-Regulation Mechanism of Temporally Asymmetric Hebbian Plasticity
- Mutual information and self-control of a fully-connected low-activity neural network