Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle

From MaRDI portal
Publication:467694

DOI: 10.1016/j.crma.2014.08.018 · zbMATH Open: 1305.92014 · arXiv: 1407.2457 · OpenAlex: W1968149591 · MaRDI QID: Q467694


Authors: Olivier Faugeras, James Maclaurin


Publication date: 4 November 2014

Published in: Comptes Rendus. Mathématique. Académie des Sciences, Paris

Abstract: We study the asymptotic law of a network of interacting neurons as the number of neurons goes to infinity. The dynamics of the neurons are described by a set of stochastic differential equations in discrete time, and the neurons interact through synaptic weights that are correlated Gaussian random variables. Unlike previous works, which made the biologically unrealistic assumption that the weights are i.i.d. random variables, we assume that they are correlated. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network of neurons, and the averaged law (with respect to the synaptic weights) of these trajectories. Our result is that the image law through the empirical measure satisfies a large deviation principle with a good rate function. We provide an analytical expression of this rate function in terms of the spectral representation of certain Gaussian processes.
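The abstract does not give the network equations explicitly, but the setting it describes can be illustrated with a minimal simulation sketch. The concrete update rule, the leak parameter `gamma`, the noise level `sigma`, the `tanh` nonlinearity, and the one-common-factor correlation structure for the weights below are all illustrative assumptions, not the paper's actual model; the sketch only shows the ingredients named in the abstract: discrete-time stochastic dynamics, correlated Gaussian synaptic weights, and the process-level empirical measure of the trajectories.

```python
import numpy as np

rng = np.random.default_rng(0)

N, T = 200, 50           # number of neurons, number of time steps
gamma, sigma = 0.9, 0.3  # leak and noise level (hypothetical values)

# Correlated Gaussian synaptic weights: each weight mixes a shared
# Gaussian factor with an independent one, so the entries of J are
# correlated rather than i.i.d. (a stand-in for the paper's
# correlated-weights assumption). The 1/sqrt(N) scaling keeps the
# total synaptic input of order one as N grows.
rho = 0.5
common = rng.standard_normal((N, 1))
idio = rng.standard_normal((N, N))
J = (np.sqrt(rho) * common + np.sqrt(1 - rho) * idio) / np.sqrt(N)

f = np.tanh  # firing-rate nonlinearity (illustrative choice)

# Discrete-time stochastic dynamics of the finite network.
V = np.zeros((T + 1, N))
for t in range(T):
    V[t + 1] = gamma * V[t] + J @ f(V[t]) + sigma * rng.standard_normal(N)

# Process-level empirical measure: each row of `trajectories` is one
# neuron's full trajectory; the empirical measure puts mass 1/N on each.
trajectories = V.T  # shape (N, T + 1)
print(trajectories.shape)
```

The large deviation principle of the paper concerns the law of this empirical measure of trajectories as N goes to infinity; the simulation above only produces one finite-N sample from it.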


Full work available at URL: https://arxiv.org/abs/1407.2457








Cites Work


Cited In (11)





