Training deep convolutional spiking neural networks with spike probabilistic global pooling
From MaRDI portal
Publication: 5083579
DOI: 10.1162/NECO_A_01480
zbMATH Open: 1492.68119
OpenAlex: W4214890633
MaRDI QID: Q5083579
FDO: Q5083579
Authors: Shuang Lian, Qianhui Liu, Rui Yan, Gang Pan, Huajin Tang
Publication date: 20 June 2022
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_01480
Recommendations
- Analyzing and accelerating the bottlenecks of training deep SNNs with backpropagation
- An FPGA implementation of deep spiking neural networks for low-power and fast classification
- The convergence analysis of spikeprop algorithm with smoothing \(L_{1/2}\) regularization
- Spiking neural networks: model, learning algorithms and applications
- Skip-connected self-recurrent spiking neural networks with joint intrinsic parameter and synaptic weight training
Cited In (5)
- Skip-connected self-recurrent spiking neural networks with joint intrinsic parameter and synaptic weight training
- Training much deeper spiking neural networks with a small number of time-steps
- Few-shot learning in spiking neural networks by multi-timescale optimization
- Analyzing and accelerating the bottlenecks of training deep SNNs with backpropagation
- An FPGA implementation of deep spiking neural networks for low-power and fast classification