Semantic learning in autonomously active recurrent neural networks
From MaRDI portal
Publication: 3588971
DOI: 10.1093/JIGPAL/JZP045
zbMATH Open: 1213.68490
arXiv: 0903.1979
OpenAlex: W3105472702
MaRDI QID: Q3588971
FDO: Q3588971
Authors: Claudius Gros, Gregor Kaczor
Publication date: 10 September 2010
Published in: Logic Journal of the IGPL
Abstract: The human brain is autonomously active, being characterized by a self-sustained neural activity that would be present even in the absence of external sensory stimuli. Here we study the interrelation between the self-sustained activity in autonomously active recurrent neural nets and external sensory stimuli. There is no a priori semantic relation between the influx of external stimuli and the patterns generated internally by the autonomous and ongoing brain dynamics. The question then arises: when and how are semantic correlations between internal and external dynamical processes learned and built up? We study this problem within the paradigm of transient state dynamics for the neural activity in recurrent neural nets, i.e., an autonomous neural activity characterized by an infinite time series of transiently stable attractor states. We propose that external stimuli will be relevant during the sensitive periods, viz. the transition period between one transient state and the subsequent semi-stable attractor. A diffusive learning signal is generated in an unsupervised fashion whenever the stimulus influences the internal dynamics qualitatively. For testing, we presented to the model system stimuli corresponding to the bars-and-stripes problem. We found that the system performs a non-linear independent component analysis on its own, while remaining continuously and autonomously active. This emergent cognitive capability results here from a general principle for the neural dynamics, the competition between neural ensembles.
Full work available at URL: https://arxiv.org/abs/0903.1979
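The bars-and-stripes problem used as the test task above has a standard construction: on an L×L grid, each horizontal bar (row) and each vertical stripe (column) is switched on independently with some probability, and the stimulus is the union of the active bars and stripes. A minimal sketch of such a stimulus generator (the function name, grid size, and activation probability here are illustrative choices, not taken from the paper):

```python
import numpy as np

def bars_and_stripes_stimulus(size=5, p=0.2, rng=None):
    """Generate one bars-and-stripes pattern on a size x size grid.

    Each row (bar) and each column (stripe) is activated independently
    with probability p; the returned grid is the union of all active
    bars and stripes (1.0 = active pixel, 0.0 = inactive).
    """
    rng = np.random.default_rng() if rng is None else rng
    rows = rng.random(size) < p   # which horizontal bars are on
    cols = rng.random(size) < p   # which vertical stripes are on
    grid = np.zeros((size, size))
    grid[rows, :] = 1.0           # paint the active bars
    grid[:, cols] = 1.0           # paint the active stripes
    return grid, rows, cols

# Example stimulus with a fixed seed for reproducibility
pattern, rows, cols = bars_and_stripes_stimulus(
    size=5, p=0.2, rng=np.random.default_rng(0))
```

A system performing a non-linear independent component analysis on such mixtures, as reported in the abstract, would recover the individual bars and stripes as its independent components, even though only their unions are ever observed.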
Document type: scientific article; zbMATH DE number 1843140
Recommendations
- Resonant spatiotemporal learning in large random recurrent networks
- Autonomous learning with complex dynamics
- Synthetic modeling of autonomous learning with a chaotic neural network
- Recurrent Infomax Generates Cell Assemblies, Neuronal Avalanches, and Simple Cell-Like Selectivity
Keywords: recurrent neural networks; autonomous neural dynamics; emergent cognitive capabilities; transient state dynamics