Information and Topology in Attractor Neural Networks
From MaRDI portal
Publication: 3440422
DOI: 10.1162/NECO.2007.19.4.956 · zbMATH Open: 1118.68116 · DBLP: journals/neco/DominguezKSR07 · arXiv: cond-mat/0506535 · OpenAlex: W2130189314 · Wikidata: Q48248988 · Scholia: Q48248988 · MaRDI QID: Q3440422 · FDO: Q3440422
Authors:
Publication date: 22 May 2007
Published in: Neural Computation
Abstract: A wide range of networks, including those with small-world topology, can be modelled by the connectivity ratio and the randomness of the links. Both the learning and the attractor abilities of a neural network can be measured by the mutual information (MI), as a function of the load rate and of the overlap between the patterns and the retrieval states. We use the MI to search for the topology that is optimal for the storage and attractor properties of the network. We find that, while the largest storage is attained in the limit of extreme dilution, the largest basin of attraction leads to an optimal topology at moderate levels of randomness, provided the connectivity is low enough. This behaviour is related to the clustering and path length of the network. We also build a diagram of the dynamical phases for random and for local initial overlaps, and show that very diluted networks lose their attractor ability.
Full work available at URL: https://arxiv.org/abs/cond-mat/0506535
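The two ingredients of the abstract can be sketched in a few lines: the topology family (a ring lattice whose links are rewired at random with some probability, Watts-Strogatz style, parameterised by a connectivity ratio and a randomness level) and the per-neuron mutual information between a stored pattern bit and the retrieved bit at overlap m. The closed-form MI below assumes unbiased binary (+/-1) patterns, a simplification of the paper's setting; the function names and the exact rewiring rule are illustrative assumptions, not the authors' code.

```python
import numpy as np


def mutual_info_per_neuron(m):
    """MI (bits) between a stored +/-1 pattern bit and the retrieved bit,
    given overlap m, for unbiased patterns: I(m) = 1 - H2((1 + m) / 2)."""
    p = (1.0 + m) / 2.0  # probability that the retrieved bit matches the pattern
    # binary entropy H2(p), with the convention 0 * log2(0) = 0
    h2 = -sum(q * np.log2(q) for q in (p, 1.0 - p) if q > 0.0)
    return 1.0 - h2


def small_world_adjacency(n, k, omega, seed=None):
    """Directed adjacency matrix: each of n neurons links to its k nearest
    clockwise neighbours on a ring (connectivity ratio gamma = k / n); each
    link is rewired to a uniformly random target with probability omega."""
    rng = np.random.default_rng(seed)
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for step in range(1, k + 1):
            j = (i + step) % n
            # rewire with probability omega; also rewire if a previous
            # random link already occupies this slot, so every row keeps
            # exactly k outgoing links
            if rng.random() < omega or adj[i, j]:
                free = [t for t in range(n) if t != i and not adj[i, t]]
                j = rng.choice(free)
            adj[i, j] = True
    return adj
```

At omega = 0 this gives a purely local (clustered, long path length) lattice, and at omega = 1 a fully random dilution; the abstract's optimum lies between the two. Note that I(0) = 0 (no retrieval) and I(1) = 1 bit per neuron (perfect retrieval).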
Recommendations
- Topology induced instabilities in neural nets with activity-dependent synapses
- Block information and topology in memory networks
- Analytic solution of attractor neural networks on scale-free graphs
- Retrieval properties of diluted attractor neural networks
- Retrieval phase diagrams for attractor neural networks with optimal interactions
Cites Work
- Statistical mechanics of complex networks
- Collective dynamics of `small-world' networks
- Neural networks and physical systems with emergent collective computational abilities
- Notions of associative memory and sparse coding
- Global and local synchrony of coupled neurons in small-world networks
- Learning Patterns and Pattern Sequences by Self-Organizing Nets of Threshold Elements
- Partially connected models of neural networks
- Parallel dynamics of extremely diluted symmetric \(Q\)-Ising neural networks
Cited In (6)
- Active networks that maximize the amount of information transmission
- Topological model of neural information networks
- Structured patterns retrieval using a metric attractor network: application to fingerprint recognition
- Block information and topology in memory networks
- Topology induced instabilities in neural nets with activity-dependent synapses
- The neural network as a renormalizer of information