Spike-based Residual Blocks

arXiv (2021)

Abstract
Deep Spiking Neural Networks (SNNs) are harder to train than ANNs because of their discrete binary activations and the need to back-propagate errors through the spatio-temporal domain. Given the huge success of ResNet in ANNs' deep learning, it is natural to attempt to use residual learning to train deep SNNs. Previous Spiking ResNets used a residual block similar to the standard block of ResNet, which we regard as inadequate for SNNs and which still suffers from the degradation problem. In this paper, we propose the spike-element-wise (SEW) residual block and prove that it can easily implement residual learning. We evaluate our SEW ResNet on ImageNet. The experimental results show that SEW ResNet obtains higher performance simply by adding more layers, providing a simple method to train deep SNNs.
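The abstract does not spell out the block's construction; a minimal sketch of the idea is that the element-wise function combines the inner path's *spikes* with the input spikes, rather than summing raw currents as in a standard residual block. Everything below is an assumption for illustration: the single-time-step `if_neuron` threshold activation, the dense `weight` standing in for the block's convolution, and the choice of logical OR as the element-wise function `g`.

```python
import numpy as np

def if_neuron(x, threshold=1.0):
    """Hypothetical single-time-step stand-in for a spiking neuron:
    emit a binary spike wherever the input current reaches threshold."""
    return (x >= threshold).astype(np.float32)

def sew_block(spikes, weight, g=np.logical_or):
    """Sketch of a spike-element-wise (SEW) residual block: the
    element-wise function g combines the block's spike output with
    the input spikes. `weight` plays the role of the block's
    convolution (a dense layer here for simplicity)."""
    out = if_neuron(spikes @ weight)          # inner path: weights + spiking neuron
    return g(out, spikes).astype(np.float32)  # spike-element-wise connection

# With g = OR, identity mapping is easy to realize: if the inner
# path emits no spikes, the block's output equals its input, which
# is what makes residual learning straightforward to implement.
x = np.array([1.0, 0.0, 1.0, 0.0])
w = np.zeros((4, 4))   # silent inner path -> block acts as identity
y = sew_block(x, w)    # equals x
```

Note that the output stays binary by construction, so stacking such blocks keeps activations spike-valued at every depth.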
Keywords
residual blocks, spike-based