Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle
Abstract: We study the asymptotic law of a network of interacting neurons as the number of neurons tends to infinity. The dynamics of the neurons are described by a set of stochastic differential equations in discrete time, and the neurons interact through synaptic weights that are correlated Gaussian random variables. Unlike previous works, which made the biologically unrealistic assumption that the weights are i.i.d. random variables, we assume that they are correlated. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network, as well as the averaged law (with respect to the synaptic weights) of these trajectories. Our main result is that the image law under the empirical measure satisfies a large deviation principle with a good rate function, for which we provide an analytical expression in terms of the spectral representation of certain Gaussian processes.
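For orientation, here is a minimal sketch of the type of model the abstract describes, written under assumed notation: the dynamics, the symbols $f$, $\gamma$, $\theta$, and the noise model are illustrative and need not match the paper's exact conventions. Each of the $N$ membrane potentials $V^i_t$ evolves in discrete time as

$$ V^i_{t+1} \;=\; \gamma\, V^i_t \;+\; \sum_{j=1}^{N} J_{ij}\, f(V^j_t) \;+\; \theta \;+\; B^i_t, \qquad i = 1,\dots,N, $$

where the synaptic weights $(J_{ij})$ are correlated Gaussian random variables and the $(B^i_t)$ are i.i.d. Gaussian noises. The process-level empirical measure of the trajectories $V^i = (V^i_t)_{0 \le t \le T}$ is

$$ \hat\mu_N \;=\; \frac{1}{N} \sum_{i=1}^{N} \delta_{V^i}, $$

and a large deviation principle with good rate function $H$ states, roughly, that for suitable sets $A$ of measures,

$$ \mathbb{P}\big(\hat\mu_N \in A\big) \;\asymp\; \exp\Big(-N \inf_{\mu \in A} H(\mu)\Big), $$

the paper's contribution being an analytical expression of $H$ via the spectral representation of certain Gaussian processes.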
Recommendations
- Asymptotic description of stochastic neural networks. II: Characterization of the limit law
- Large deviations for randomly connected neural networks. II: State-dependent interactions
- Large deviations and mean-field theory for asymmetric random recurrent neural networks
- Large deviations, dynamics and phase transitions in large stochastic and disordered neural networks
- Large deviations for randomly connected neural networks. I: Spatially extended systems
Cites work
- A law of large numbers for random walks in random environment
- Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle
- Asymptotic description of stochastic neural networks. II: Characterization of the limit law
- Large deviations and mean-field theory for asymmetric random recurrent neural networks
- Microcanonical distributions for lattice gases
- The point of view of the particle on the law of large numbers for random walks in a mixing random environment
Cited in (11)
- Besicovitch almost automorphic stochastic processes in distribution and an application to Clifford-valued stochastic neural networks
- Mean-field limit of generalized Hawkes processes
- A formalism for evaluating analytically the cross-correlation structure of a firing-rate network model
- Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle
- Asymptotic description of stochastic neural networks. II: Characterization of the limit law
- Large deviations for randomly connected neural networks. I: Spatially extended systems
- Large deviations for randomly connected neural networks. II: State-dependent interactions
- A large deviation principle and an expression of the rate function for a discrete stationary Gaussian process
- Large deviations for nonlocal stochastic neural fields
- Large deviations and mean-field theory for asymmetric random recurrent neural networks
- Quenched large deviations for interacting diffusions in random media