Learning Precise Spike Train–to–Spike Train Transformations in Multilayer Feedforward Neuronal Networks
From MaRDI portal
Publication:5380422
Abstract: We derive a synaptic weight update rule for learning temporally precise spike train–to–spike train transformations in multilayer feedforward networks of spiking neurons. The framework, aimed at seamlessly generalizing error backpropagation to the deterministic spiking neuron setting, is based strictly on spike timing and avoids invoking concepts pertaining to spike rates or probabilistic models of spiking. The derivation is founded on two innovations. First, an error functional is proposed that compares the spike train emitted by the output neuron of the network to the desired spike train by way of their putative impact on a virtual postsynaptic neuron. This formulation sidesteps the need for spike alignment and leads to closed-form solutions for all quantities of interest. Second, the virtual assignment of weights to spikes rather than to synapses enables a perturbation analysis of individual spike times and synaptic weights of the output neuron as well as all intermediate neurons in the network, which yields the gradients of the error functional with respect to these entities. Learning proceeds via a gradient descent mechanism that leverages these quantities. Simulation experiments demonstrate the efficacy of the proposed learning framework. The experiments also highlight asymmetries between synapses on excitatory and inhibitory neurons.
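The abstract's first innovation, an error functional that compares the output and desired spike trains through their putative impact on a virtual postsynaptic neuron, can be illustrated with a minimal sketch. The exponential kernel, its time constant, and the function names below are illustrative assumptions, not the paper's actual formulation; the point is only that smoothing both trains through a common postsynaptic kernel yields a distance that needs no explicit spike-to-spike alignment.

```python
import numpy as np

def psp_trace(spike_times, t_grid, tau=5.0):
    """Sum of exponential postsynaptic-potential kernels, one per spike.

    Models each spike's putative impact on a virtual postsynaptic neuron
    (kernel shape and tau are illustrative choices, not the paper's).
    """
    trace = np.zeros_like(t_grid)
    for s in spike_times:
        trace += np.where(t_grid >= s, np.exp(-(t_grid - s) / tau), 0.0)
    return trace

def spike_train_error(output_spikes, desired_spikes,
                      t_max=100.0, dt=0.1, tau=5.0):
    """Squared distance between two spike trains, compared through their
    smoothed impact on a virtual postsynaptic neuron.

    Because the comparison happens at the level of the traces, no
    pairing of output spikes with desired spikes is required.
    """
    t = np.arange(0.0, t_max, dt)
    diff = psp_trace(output_spikes, t, tau) - psp_trace(desired_spikes, t, tau)
    return dt * np.sum(diff ** 2)
```

Since the traces are smooth functions of the spike times, an error of this form is differentiable with respect to individual spike times, which is what makes the perturbation analysis and gradient descent described in the abstract possible.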
Recommendations
- Learning spatiotemporally encoded pattern transformations in structured spiking neural networks
- Training spiking neural networks in the strong coupling regime
- Spiking neural networks for cortical neuronal spike train decoding
- An exact mapping from ReLU networks to spiking neural networks
- Training much deeper spiking neural networks with a small number of time-steps
- Supervised learning in multilayer spiking neural networks
- Robust learning in SpikeProp
Cites work
- A gradient descent rule for spiking neurons emitting multiple spikes
- A novel spike distance
- Connectomic constraints on computation in feedforward networks of spiking neurons
- Error-backpropagation in temporally encoded networks of spiking neurons
- Learning representations by back-propagating errors
- On the Phase-Space Dynamics of Systems of Spiking Neurons. I: Model and Experiments
- On the sensitive dependence on initial conditions of the dynamics of networks of spiking neurons
- Spiking Neuron Models
- Supervised Learning in Spiking Neural Networks with ReSuMe: Sequence Learning, Classification, and Spike Shifting
- Supervised learning in multilayer spiking neural networks
Cited in (9)
- On the Algorithmic Power of Spiking Neural Networks
- Scientific article; zbMATH DE number 1683882 (no title available)
- Robust spike-train learning in spike-event based weight update
- SuperSpike: supervised learning in multilayer spiking neural networks
- An exact mapping from ReLU networks to spiking neural networks
- Multilevel Artificial Neural Network Training for Spatially Correlated Learning
- Spike-Timing Error Backpropagation in Theta Neuron Networks
- Modeling the learning of a spiking neural network with synaptic delays
- A remark on the error-backpropagation learning algorithm for spiking neural networks