A Mathematical Analysis of the Effects of Hebbian Learning Rules on the Dynamics and Structure of Discrete-Time Random Recurrent Neural Networks

From MaRDI portal
Publication:3544323

DOI: 10.1162/NECO.2008.05-07-530
zbMATH Open: 1159.68536
arXiv: 0705.3690
OpenAlex: W1967216659
Wikidata: Q51873083 (Scholia: Q51873083)
MaRDI QID: Q3544323
FDO: Q3544323


Authors: Benoît Siri, Hugues Berry, Bruno Delord, Mathias Quoy, Bruno Cessac


Publication date: 5 December 2008

Published in: Neural Computation

Abstract: We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule that includes passive forgetting and different time scales for neuronal activity and learning dynamics. Previous numerical studies have reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, which involve a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which provide both a structural and a dynamical point of view on the evolution of the neural network. Furthermore, we show that the sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
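
The sketch below illustrates the setting the abstract describes, under stated assumptions rather than as the paper's exact model: a discrete-time random recurrent network x(t+1) = f(W x(t)) with a sigmoid transfer function, a Hebbian update with passive forgetting of the form W ← λW + (α/N) x xᵀ, and the largest Lyapunov exponent estimated from products of the Jacobians D(t) = diag(f'(u(t))) W. The specific rule, gain, and parameter values here are illustrative choices, not taken from the paper.

```python
# Minimal sketch (assumptions, not the paper's exact implementation):
# a discrete-time random recurrent network with sigmoid units, trained
# by a generic Hebbian rule with passive forgetting, and the largest
# Lyapunov exponent estimated via Jacobian products along the orbit.
import numpy as np

rng = np.random.default_rng(0)
N = 100                    # network size (illustrative)
g = 5.0                    # synaptic gain; large g favors chaos
lam = 0.99                 # passive-forgetting factor (0 < lam <= 1)
alpha = 0.01               # Hebbian learning rate
T_act, T_learn = 200, 100  # fast activity steps per slow learning epoch

W = rng.normal(0.0, g / np.sqrt(N), (N, N))  # random initial weights
x = rng.uniform(0.0, 1.0, N)                 # initial activities

def f(u):
    """Sigmoid transfer function."""
    return 1.0 / (1.0 + np.exp(-u))

def lyapunov_max(W, x, steps=400):
    """Estimate the largest Lyapunov exponent by propagating a tangent
    vector through the Jacobians D(t) = diag(f'(u(t))) @ W and
    renormalizing at each step."""
    v = rng.normal(size=len(x))
    v /= np.linalg.norm(v)
    acc = 0.0
    for _ in range(steps):
        u = W @ x
        s = f(u)
        D = (s * (1.0 - s))[:, None] * W  # Jacobian of x -> f(W x)
        v = D @ v
        nv = np.linalg.norm(v)
        acc += np.log(nv)
        v /= nv
        x = s
    return acc / steps

for epoch in range(T_learn):
    for _ in range(T_act):            # fast time scale: neuronal dynamics
        x = f(W @ x)
    # slow time scale: Hebbian update with passive forgetting
    W = lam * W + (alpha / N) * np.outer(x, x)
    if epoch % 20 == 0:
        print(f"epoch {epoch:3d}  lambda_max ~ {lyapunov_max(W, x.copy()):+.3f}")
```

Tracking lambda_max across epochs is one way to observe the qualitative effect the abstract reports: the exponent drifting from positive (chaos) toward zero and below (steady state) as learning reshapes the weight matrix.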


Full work available at URL: https://arxiv.org/abs/0705.3690






Cited in: 12 documents




