Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle (Q467694)

    Statements

    Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle (English)
    4 November 2014
    The article considers a fully connected network of neurons in which the synaptic weights are Gaussian correlated random variables. Its aim is to investigate the asymptotic behaviour and the large deviations of the network as the number of neurons tends to infinity. In the case where the synaptic weights are i.i.d. zero-mean Gaussian random variables, this problem was investigated by \textit{H. Sompolinsky} et al. in [``Chaos in random neural networks'', Phys. Rev. Lett. 61, 259--262 (1988; \url{doi:10.1103/PhysRevLett.61.259})]. The present article sets up the mathematical framework, including the definitions, the model and the preliminaries. The main result, which states that ``\(\Pi^n\) is governed by a large deviation principle with a good rate function \(H\)'', is only announced, not proved: the authors explain that the proof is too long to be reproduced here, and they indicate its main steps instead. For the details and proofs, the reader should consult the complete paper by the authors [``A large deviation principle for networks of rate neurons with correlated synaptic weights'', Preprint, \url{arXiv:1302.1029}].
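    For context, the standard meaning of the announced result can be recalled as follows (this is the usual textbook definition of a large deviation principle; the speed \(n\) shown here is an illustrative assumption, since the review does not specify the normalisation used by the authors). A sequence of probability measures \((\Pi^n)_{n\ge 1}\) on a topological space \(\mathcal{X}\) satisfies a large deviation principle with good rate function \(H\colon \mathcal{X}\to[0,\infty]\) if \(H\) is lower semicontinuous with compact level sets \(\{x : H(x)\le c\}\), and
    \[
    \limsup_{n\to\infty} \frac{1}{n}\log \Pi^n(F) \;\le\; -\inf_{x\in F} H(x)
    \quad\text{for every closed set } F\subseteq\mathcal{X},
    \]
    \[
    \liminf_{n\to\infty} \frac{1}{n}\log \Pi^n(G) \;\ge\; -\inf_{x\in G} H(x)
    \quad\text{for every open set } G\subseteq\mathcal{X}.
    \]
    Informally, \(\Pi^n(A)\approx e^{-n\inf_A H}\): events on which \(H\) is bounded away from zero become exponentially unlikely as the number of neurons grows.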
    asymptotic behaviour
    large deviations
    random variables
    rate function
