A Mathematical Analysis of the Effects of Hebbian Learning Rules on the Dynamics and Structure of Discrete-Time Random Recurrent Neural Networks
From MaRDI portal
Publication: 3544323
DOI: 10.1162/NECO.2008.05-07-530
zbMATH Open: 1159.68536
arXiv: 0705.3690
OpenAlex: W1967216659
Wikidata: Q51873083 (Scholia: Q51873083)
MaRDI QID: Q3544323
FDO: Q3544323
Authors: Benoît Siri, Hugues Berry, Bruno Delord, Mathias Quoy, Bruno Cessac
Publication date: 5 December 2008
Published in: Neural Computation
Abstract: We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule that includes passive forgetting and different time scales for neuronal activity and learning dynamics. Previous numerical studies have reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, involving a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which introduce both a structural and a dynamical point of view on the neural network evolution. Furthermore, we show that the sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
Full work available at URL: https://arxiv.org/abs/0705.3690
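The abstract describes three ingredients that can be sketched concretely: a discrete-time random recurrent network, a generic Hebbian update with passive forgetting operating on a slower time scale than the neuronal dynamics, and the largest Lyapunov exponent computed from products of Jacobian matrices. The following is a minimal illustrative sketch, not the authors' exact equations; the update rules, gains, and rates (`g`, `lam`, `alpha`, the static input `u`) are assumptions chosen only to exhibit the mechanism.

```python
import numpy as np

# Hedged sketch (assumed forms, not the paper's exact model):
#   neuronal dynamics:  x_{t+1} = tanh(g * W x_t + u)
#   Hebbian learning with passive forgetting (slow time scale):
#       W <- (1 - lam) * W + (alpha / N) * outer(x, x)
rng = np.random.default_rng(0)
N = 100
g = 3.0                        # gain (assumed); large enough for a chaotic regime
lam, alpha = 0.05, 0.1         # forgetting and learning rates (assumed)
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))   # random recurrent weights
u = 0.5 * rng.normal(size=N)   # static input pattern (assumed)
x = rng.uniform(-1.0, 1.0, N)

def largest_lyapunov(W, x, u, T=400):
    """Estimate the largest Lyapunov exponent from the Jacobian product.
    For x_{t+1} = tanh(g W x_t + u), the Jacobian at step t is
    diag(g * (1 - x_{t+1}^2)) @ W; we track the growth of a tangent vector."""
    v = rng.normal(size=len(x))
    v /= np.linalg.norm(v)
    log_growth = 0.0
    for _ in range(T):
        x = np.tanh(g * W @ x + u)
        J = (g * (1.0 - x**2))[:, None] * W   # Jacobian of the map at x
        v = J @ v
        nv = np.linalg.norm(v)
        log_growth += np.log(nv)
        v /= nv
    return log_growth / T

for epoch in range(50):
    # fast neuronal dynamics between two learning steps
    for _ in range(20):
        x = np.tanh(g * W @ x + u)
    # slow Hebbian update with passive forgetting
    W = (1.0 - lam) * W + (alpha / N) * np.outer(x, x)

lyap = largest_lyapunov(W.copy(), x.copy(), u)
print(lyap)
```

In this toy setting the passive-forgetting term contracts the weight matrix at every learning step, which is one way to see why learning can drive the dynamics from chaos toward a steady state; the paper's point that pattern sensitivity peaks when the largest Lyapunov exponent is near 0 can be explored by monitoring `largest_lyapunov` across epochs.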
Recommendations
- Emergence of symmetric, modular, and reciprocal connections in recurrent networks with Hebbian learning
- Stable and convergent dynamics for discrete-time recurrent networks
- A learning rule with generalized Hebbian synapses
- Instability of frozen-in states in synchronous Hebbian neural networks
- A Geometrical Analysis of Global Stability in Trained Feedback Networks
Cites Work
- Complex networks: structure and dynamics
- Network synchronization: Spectral versus statistical properties
- Smooth dynamics and new theoretical ideas in nonequilibrium statistical mechanics
- Positive and negative circuits in dynamical systems
- Linear response, susceptibility and resonances in chaotic toy models
- Transmitting a signal by amplitude modulation in a chaotic network
- CONTROL OF THE TRANSITION TO CHAOS IN NEURAL NETWORKS WITH RANDOM CONNECTIVITY
- Absolute stability criterion for discrete time neural networks
- Resonant spatiotemporal learning in large random recurrent networks
Cited In (12)
- Neural network spectral robustness under perturbations of the underlying graph
- Real and complex behavior for networks of coupled logistic maps
- A discrete dynamical system as a model of a neural network with generalized Hebbian synapses
- Linear response in neuronal networks: From neurons dynamics to collective response
- Mathematical studies of the dynamics of finite-size binary neural networks: a review of recent progress
- A learning rule with generalized Hebbian synapses
- Optimizing synchronizability in networks of coupled systems
- Learning-induced pattern classification in a chaotic neural network
- How Gibbs distributions may naturally arise from synaptic adaptation mechanisms. A model-based argumentation
- Emergence of symmetric, modular, and reciprocal connections in recurrent networks with Hebbian learning
- Analysis of Linsker's application of Hebbian rules to linear networks
- Modeling and contractivity of neural-synaptic networks with Hebbian learning