NUTS-BSNN: A non-uniform time-step binarized spiking neural network with energy-efficient in-memory computing macro

Neurocomputing (2023)

Abstract
This work introduces NUTS-BSNN, a Non-uniform Time-step Binarized Spiking Neural Network. NUTS-BSNN is a fully binarized spiking neural network in which all weights, including those of the input and output layers, are binary. In the input and output layers, the weights are represented as stochastic series of numbers, while in the hidden layers they are approximated to binary values so that simple XNOR-based computations can be used. To compensate for the information loss caused by binarization, the number of convolutions at the input layer is increased, with these convolutions computed sequentially over multiple time-steps. Their results are accumulated before spikes are generated for the subsequent layers, which improves overall performance. We chose 14 time-steps for accumulation as a good tradeoff between performance and inference latency. The proposed network was trained directly with a surrogate gradient algorithm and evaluated on three datasets, achieving classification accuracies of 93.25%, 88.71%, and 70.31% on Fashion-MNIST, CIFAR-10, and CIFAR-100, respectively. Further, we present an in-memory computing architecture for NUTS-BSNN that limits resource and power consumption in hardware implementations.
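
To make the time-step accumulation and weight binarization concrete, the following is a minimal PyTorch sketch of how an input layer of this kind could look. It is not the authors' implementation: the class names (StochasticBinarizeSTE, SurrogateSpike, BinaryConvInputLayer), the rectangular surrogate gradient, the firing threshold, and the kernel size are assumptions made for illustration; only the ideas of stochastically represented input-layer weights, accumulation over 14 time-steps, and surrogate-gradient training come from the abstract.

```python
# Minimal sketch (assumption-laden, not the authors' code): an input layer whose
# stochastically binarized convolution is repeated and accumulated over 14
# time-steps before a surrogate-gradient spike is emitted.
import torch
import torch.nn as nn
import torch.nn.functional as F


class StochasticBinarizeSTE(torch.autograd.Function):
    """Sample a {-1, +1} weight whose expectation matches the real-valued
    weight (clipped to [-1, 1]); gradients use a straight-through estimator."""
    @staticmethod
    def forward(ctx, w):
        p = (w.clamp(-1.0, 1.0) + 1.0) / 2.0  # probability of sampling +1
        return torch.bernoulli(p) * 2.0 - 1.0

    @staticmethod
    def backward(ctx, grad_out):
        return grad_out  # straight-through estimator


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectangular surrogate gradient
    in the backward pass (the surrogate shape is an assumed choice)."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        return grad_out * (v.abs() < 0.5).float()


class BinaryConvInputLayer(nn.Module):
    """Input layer: draw a fresh stochastic binary weight sample at every
    time-step, convolve, accumulate the results, then generate spikes."""
    def __init__(self, in_ch, out_ch, time_steps=14, threshold=1.0):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_ch, in_ch, 3, 3).uniform_(-0.1, 0.1))
        self.time_steps = time_steps
        self.threshold = threshold  # illustrative firing threshold

    def forward(self, x):
        acc = 0.0
        for _ in range(self.time_steps):
            w_bin = StochasticBinarizeSTE.apply(self.weight)
            acc = acc + F.conv2d(x, w_bin, padding=1)
        # Averaging over time-steps approximates the full-precision convolution.
        return SurrogateSpike.apply(acc / self.time_steps - self.threshold)


if __name__ == "__main__":
    layer = BinaryConvInputLayer(in_ch=3, out_ch=16, time_steps=14)
    spikes = layer(torch.rand(1, 3, 32, 32))  # CIFAR-sized dummy input
    print(spikes.shape, spikes.unique())      # binary {0, 1} spike map
```

In the hidden layers, by contrast, the abstract states that weights are deterministically binarized, which lets multiply-accumulate operations be replaced by XNOR-style computations; that property is presumably what the proposed in-memory computing macro exploits.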
Keywords
Neuromorphic Computing, Binary Spiking Neural Networks, In-memory Computing, Edge-AI Applications