Modeling and contractivity of neural-synaptic networks with Hebbian learning (Q6550233)
From MaRDI portal
scientific article; zbMATH DE number 7860006
Statements
Modeling and contractivity of neural-synaptic networks with Hebbian learning (English)
5 June 2024
Motivated by biological considerations, this paper studies neural-synaptic systems that combine Hopfield neural networks (HNN) and firing-rate neural networks (FRNN) for the neural dynamics with two different Hebbian learning (HL) rules for the synaptic dynamics. The second section reviews basic notions such as contraction, composite norms, Metzler matrices, and out-incidence and in-incidence matrices. The third section presents the models: the dynamics of Hopfield neural networks and firing-rate neural networks are defined, as well as the Hebbian learning rule, the anti-Hebbian learning rule, and an Oja-like learning rule. From these neural and synaptic dynamics, three coupled neural-synaptic models are built. The first, the Hopfield-Hebbian model, combines the HNN dynamics with the HL rule under given initial neural and synaptic conditions. The second, the firing-rate-Hebbian model, combines the FRNN dynamics with the HL rule, again under given initial neural and synaptic conditions. The third, the Hopfield-Oja model, combines the HNN dynamics with the Oja-like synaptic plasticity rule, likewise under given initial conditions. Assumptions on bounds for the activation functions and external stimuli are introduced, and the three models are then rewritten in a low-dimensional formulation. The fourth section investigates the boundedness of solutions to the models, the contractivity of the models, and whether the models satisfy Dale's principle. The theoretical results are validated by an example for the Hopfield-Hebbian model. Conclusions and future work are presented in the last section of the article.

An interesting and comprehensive article.
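The Hopfield-Hebbian coupling described in the review can be illustrated with a minimal numerical sketch. This is not the authors' exact model (the function name, equations, and all parameter values below are generic textbook choices): it pairs Hopfield-type neural dynamics having a bounded activation with a decaying Hebbian rule for the synaptic matrix, matching the review's mention of boundedness assumptions on activations and stimuli.

```python
import numpy as np

# Illustrative sketch only (not the paper's exact equations): couple
# Hopfield-type neural dynamics
#     x' = -x + W phi(x) + u
# with a decaying Hebbian rule for the synaptic matrix
#     tau_w W' = -W + phi(x) phi(x)^T,
# where phi = tanh is bounded, as in the boundedness assumptions.

def simulate_hopfield_hebbian(n=4, steps=2000, dt=0.01, tau_w=10.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)                # neural state
    W = 0.1 * rng.standard_normal((n, n))     # synaptic weights
    u = 0.5 * np.ones(n)                      # constant external stimulus
    for _ in range(steps):
        phi = np.tanh(x)                      # bounded activation
        x = x + dt * (-x + W @ phi + u)       # neural dynamics (Euler step)
        W = W + (dt / tau_w) * (-W + np.outer(phi, phi))  # Hebbian rule
    return x, W

x, W = simulate_hopfield_hebbian()
print(np.abs(x).max(), np.abs(W).max())  # both remain bounded
```

Because phi is bounded and both the neural and synaptic equations contain leak terms, trajectories of this sketch remain bounded, which is the qualitative behavior the paper's boundedness results establish rigorously for its models.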
nonlinear network systems
Hebbian/anti-Hebbian learning
contraction theory
Hopfield neural networks