Elements for a general memory structure: properties of recurrent neural networks used to form situation models
From MaRDI portal
Publication:937740
DOI: 10.1007/s00422-008-0221-5 · zbMath: 1145.92305 · DBLP: journals/bc/MakarovSVHC08 · OpenAlex: W2130012086 · Wikidata: Q51890658 · Scholia: Q51890658 · MaRDI QID: Q937740
Manuel G. Velarde, Holk Cruse, Yongli Song, David Hübner, Valeri A. Makarov
Publication date: 15 August 2008
Published in: Biological Cybernetics
Full work available at URL: https://doi.org/10.1007/s00422-008-0221-5
Related Items (4)
- Synchronization of heteroclinic circuits through learning in coupled neural networks
- Compact internal representation of dynamic situations: neural network implementing the causality principle
- Prediction-for-CompAction: navigation in social environments using generalized cognitive maps
- Selforganizing memory: Active learning of landmarks used for navigation
Cites Work
- Selforganizing memory: Active learning of landmarks used for navigation
- MMC -- a new numerical approach to the kinematics of complex manipulators
- Introduction to linear algebra
- Modelling memory functions with recurrent neural networks consisting of input compensation units: I. Static situations
- Modelling memory functions with recurrent neural networks consisting of input compensation units: II. Dynamic situations
- Parameter Space Structure of Continuous-Time Recurrent Neural Networks
- Complex dynamics and the structure of small neural networks
- Neural networks and physical systems with emergent collective computational abilities.
- Neurons with graded response have collective computational properties like those of two-state neurons.