When Bio-Inspired Computing meets Deep Learning: Low-Latency, Accurate, & Energy-Efficient Spiking Neural Networks from Artificial Neural Networks
CoRR (2023)
Abstract
Bio-inspired Spiking Neural Networks (SNN) are now demonstrating comparable
accuracy to intricate convolutional neural networks (CNN), all while delivering
remarkable energy and latency efficiency when deployed on neuromorphic
hardware. In particular, ANN-to-SNN conversion has recently gained significant
traction in developing deep SNNs with close to state-of-the-art (SOTA) test
accuracy on complex image recognition tasks. However, advanced ANN-to-SNN
conversion approaches demonstrate that for lossless conversion, the number of
SNN time steps must equal the number of quantization steps in the ANN
activation function. Reducing the number of time steps significantly increases
the conversion error. Moreover, the spiking activity of the SNN, which
dominates the compute energy in neuromorphic chips, does not reduce
proportionally with the number of time steps. To mitigate the accuracy concern,
we propose a novel ANN-to-SNN conversion framework that requires exponentially
fewer time steps than the SOTA conversion approaches. Our framework modifies
the SNN integrate-and-fire (IF) neuron model, at no added complexity, and
shifts the bias term of each batch normalization (BN) layer in the trained
ANN. To mitigate the spiking activity
concern, we propose training the source ANN with a fine-grained L1 regularizer
with surrogate gradients that encourages high spike sparsity in the converted
SNN. Our proposed framework thus yields lossless SNNs with ultra-low latency
and ultra-low compute energy, thanks to the very small number of time steps and
the high spike sparsity, together with ultra-high test accuracy: for example,
73.30% with only 4 time steps on the ImageNet dataset.
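A minimal sketch (not the authors' code) of the standard correspondence the abstract refers to: the firing rate of an integrate-and-fire neuron simulated for T time steps matches a T-level quantized ReLU in the source ANN, so reducing T below the number of quantization steps introduces conversion error. Function names, the soft-reset choice, and the unit threshold are illustrative assumptions.

```python
import math

def quantized_relu(x, T, theta=1.0):
    """T-level quantized ReLU: the ANN activation that lossless
    ANN-to-SNN conversion matches to T SNN time steps."""
    return theta * min(max(math.floor(x * T / theta) / T, 0.0), 1.0)

def if_neuron(x, T, theta=1.0):
    """Integrate-and-fire (IF) neuron driven by a constant input x for
    T steps; it fires whenever the membrane potential v crosses the
    threshold theta, with reset-by-subtraction (soft reset)."""
    v, spikes = 0.0, 0
    for _ in range(T):
        v += x          # integrate the input current
        if v >= theta:  # fire and subtract the threshold
            spikes += 1
            v -= theta
    # Average post-synaptic output over the T-step window.
    return theta * spikes / T
```

For inputs in [0, theta) the two agree exactly, e.g. `quantized_relu(0.63, 4)` and `if_neuron(0.63, 4)` both give 0.5; the gap only opens when T is pushed below the ANN's quantization-step count, which is the regime the proposed framework targets.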