Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle
DOI: 10.1016/J.CRMA.2014.08.018
zbMATH Open: 1305.92014
arXiv: 1407.2457
OpenAlex: W1968149591
MaRDI QID: Q467694
FDO: Q467694
Authors: Olivier Faugeras, James Maclaurin
Publication date: 4 November 2014
Published in: Comptes Rendus. Mathématique. Académie des Sciences, Paris
Full work available at URL: https://arxiv.org/abs/1407.2457
Recommendations
- Asymptotic description of stochastic neural networks. II: Characterization of the limit law
- Large deviations for randomly connected neural networks. II: State-dependent interactions
- Large deviations and mean-field theory for asymmetric random recurrent neural networks
- Large deviations, dynamics and phase transitions in large stochastic and disordered neural networks
- Large deviations for randomly connected neural networks. I: Spatially extended systems
Classification (MSC)
- Large deviations (60F10)
- Interacting random processes; statistical mechanics type models; percolation theory (60K35)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
Cites Work
- A law of large numbers for random walks in random environment
- Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle
- Asymptotic description of stochastic neural networks. II: Characterization of the limit law
- Large deviations and mean-field theory for asymmetric random recurrent neural networks
- Microcanonical distributions for lattice gases
- The point of view of the particle on the law of large numbers for random walks in a mixing random environment
Cited In (11)
- Besicovitch almost automorphic stochastic processes in distribution and an application to Clifford-valued stochastic neural networks
- Mean-field limit of generalized Hawkes processes
- A formalism for evaluating analytically the cross-correlation structure of a firing-rate network model
- Asymptotic description of stochastic neural networks. I: Existence of a large deviation principle
- Asymptotic description of stochastic neural networks. II: Characterization of the limit law
- Large deviations for randomly connected neural networks. I: Spatially extended systems
- Large deviations for randomly connected neural networks. II: State-dependent interactions
- A large deviation principle and an expression of the rate function for a discrete stationary Gaussian process
- Large deviations for nonlocal stochastic neural fields
- Large deviations and mean-field theory for asymmetric random recurrent neural networks
- Quenched large deviations for interacting diffusions in random media