Multiscale analysis of slow-fast neuronal learning models with noise
DOI: 10.1186/2190-8567-2-13 · zbMath: 1291.60119 · OpenAlex: W2109173863 · Wikidata: Q43248040 · Scholia: Q43248040 · MaRDI QID: Q2251507
Gilles Wainrib, Mathieu N. Galtier
Publication date: 14 July 2014
Published in: The Journal of Mathematical Neuroscience
Full work available at URL: https://doi.org/10.1186/2190-8567-2-13
Keywords: stochastic differential equations; model reduction; unsupervised learning; averaging; slow-fast systems; Hebbian learning; STDP; recurrent networks; inhomogeneous Markov process
MSC: Stochastic ordinary differential equations (aspects of stochastic analysis) (60H10); Neural biology (92C20); Averaging method for ordinary differential equations (34C29); Ordinary differential equations and systems with randomness (34F05); Applications of Brownian motions and diffusion theory (population genetics, absorption problems, etc.) (60J70)
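As a rough illustration of the slow-fast averaging setting indicated by the title and keywords (a sketch only, not quoted from the paper; F, G, \(\Sigma\), and \(\mu_w\) are placeholder symbols), fast noisy neuronal activity is coupled to slowly adapting synaptic weights, and the slow variable is reduced by averaging against the stationary law of the fast dynamics:

\[
\begin{aligned}
dv^{\epsilon}_t &= \tfrac{1}{\epsilon}\, F(v^{\epsilon}_t, w^{\epsilon}_t)\, dt
  + \tfrac{1}{\sqrt{\epsilon}}\, \Sigma(v^{\epsilon}_t)\, dB_t
  && \text{(fast neuronal dynamics with noise)}\\
dw^{\epsilon}_t &= G(v^{\epsilon}_t, w^{\epsilon}_t)\, dt
  && \text{(slow Hebbian-type learning rule)}\\
\frac{d\bar w_t}{dt} &= \int G(v, \bar w_t)\, \mu_{\bar w_t}(dv)
  && \text{(averaged reduction as } \epsilon \to 0\text{)}
\end{aligned}
\]

where \(\mu_w\) denotes the stationary distribution of the fast dynamics with the weights frozen at \(w\).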
Cites Work
- Emergence of network structure due to spike-timing-dependent plasticity in recurrent neuronal networks. IV: Structuring synaptic pathways among recurrent connections
- Asymptotic behavior in time periodic parabolic problems with unbounded coefficients
- On an explicit representation of the solution of linear stochastic partial differential equations with delays
- Phenomenological models of synaptic plasticity based on spike timing
- Mathematical foundations of neuroscience
- Geometric singular perturbation theory for ordinary differential equations
- A simplified neuron model as a principal component analyzer
- Singular perturbation methods for ordinary differential equations
- Introduction to functional differential equations
- Mathematical formulations of Hebbian learning
- Spike-timing-dependent plasticity for neurons with recurrent connections
- Spiking Neuron Models
- Strong Convergence of Euler-Type Methods for Nonlinear Stochastic Differential Equations