Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle (Q467694)
Language | Label | Description | Also known as
---|---|---|---
English | Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle | scientific article |
Statements
Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle (English)
4 November 2014
The paper considers a fully connected network of neurons in which the synaptic weights are correlated Gaussian random variables. The aim of the article is to investigate the asymptotic behaviour and large deviations of the network as the number of neurons goes to infinity. In the case where the synaptic weights are i.i.d. zero-mean Gaussian random variables, this problem was investigated by \textit{H. Sompolinsky} et al. in [``Chaos in random neural networks'', Phys. Rev. Lett. 61, 259--262 (1988; \url{doi:10.1103/PhysRevLett.61.259})]. The present article develops the mathematical framework, including definitions, the model and preliminaries. The main result, namely that ``\(\Pi^n\) is governed by a large deviation principle with a good rate function \(H\)'', is only announced, not proved: since the proof is too long to be reproduced here, the authors indicate only its main steps. For details and proofs, the reader should consult the complete paper by the authors [``A large deviation principle for networks of rate neurons with correlated synaptic weights'', Preprint, \url{arXiv:1302.1029}].
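For orientation, here is a minimal sketch of what the announced result asserts, in the standard formulation of large deviation theory; the measures \(\Pi^n\) and the rate function \(H\) are those named in the review, while the speed \(n\) and the Borel setting are the customary assumptions, not details taken from the paper:
\[
-\inf_{x\in A^{\circ}} H(x) \;\le\; \liminf_{n\to\infty}\frac{1}{n}\log \Pi^n(A) \;\le\; \limsup_{n\to\infty}\frac{1}{n}\log \Pi^n(A) \;\le\; -\inf_{x\in \bar{A}} H(x)
\]
for every Borel set \(A\), where \(A^{\circ}\) and \(\bar{A}\) denote the interior and closure of \(A\). Here ``good'' means that the sublevel sets \(\{x : H(x)\le c\}\) are compact for every \(c\ge 0\), so that the infimum of \(H\) over a closed set is attained whenever it is finite.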
asymptotic behaviour
large deviations
random variables
rate function