Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle
Publication: 467694
DOI: 10.1016/j.crma.2014.08.018
zbMath: 1305.92014
arXiv: 1407.2457
OpenAlex: W1968149591
MaRDI QID: Q467694
Olivier Faugeras, James N. Maclaurin
Publication date: 4 November 2014
Published in: Comptes Rendus. Mathématique. Académie des Sciences, Paris
Full work available at URL: https://arxiv.org/abs/1407.2457
Mathematics Subject Classification:
- Neural networks for/in biological studies, artificial life and related topics (92B20)
- Interacting random processes; statistical mechanics type models; percolation theory (60K35)
- Large deviations (60F10)
Related Items (7)
- A formalism for evaluating analytically the cross-correlation structure of a firing-rate network model
- A large deviation principle and an expression of the rate function for a discrete stationary Gaussian process
- Quenched large deviations for interacting diffusions in random media
- Mean-field limit of generalized Hawkes processes
- Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle
- Asymptotic description of stochastic neural networks. II: Characterization of the limit law
- Large deviations for randomly connected neural networks: I. Spatially extended systems
Cites Work
- Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle
- Asymptotic description of stochastic neural networks. II: Characterization of the limit law
- Microcanonical distributions for lattice gases
- The point of view of the particle on the law of large numbers for random walks in a mixing random environment
- A law of large numbers for random walks in random environment
- Large deviations and mean-field theory for asymmetric random recurrent neural networks