Memory Replay for Continual Learning with Spiking Neural Networks

2023 IEEE 33rd International Workshop on Machine Learning for Signal Processing (MLSP)

Abstract
Two of the most impressive features of biological neural networks are their high energy efficiency and their ability to continuously adapt to varying inputs. In contrast, the power required to train top-performing deep learning models rises as they grow more complex. This is the main reason for the increasing research interest in spiking neural networks, which mimic the functioning of the human brain and achieve performance similar to artificial neural networks, but at much lower energy cost. However, even this type of network lacks the ability to incrementally learn new tasks, the main obstacle being catastrophic forgetting. This paper investigates memory replay as a strategy to mitigate catastrophic forgetting in spiking neural networks. Experiments are conducted on the MNIST-split dataset in both class-incremental learning and task-free continual learning scenarios.
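The paper's own implementation is not reproduced here; as a rough, hypothetical sketch of the generic memory-replay idea it investigates, a fixed-size rehearsal buffer (here filled via reservoir sampling, one common choice) stores past examples and mixes them into batches for new tasks:

```python
import random


class ReplayBuffer:
    """Fixed-size rehearsal memory using reservoir sampling.

    Illustrative sketch only: the paper does not specify this exact
    buffer design; class and method names are hypothetical.
    """

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.memory = []      # stored (input, label) examples
        self.seen = 0         # total examples observed so far
        self.rng = random.Random(seed)

    def add(self, example):
        """Insert an example, keeping a uniform sample of the stream."""
        self.seen += 1
        if len(self.memory) < self.capacity:
            self.memory.append(example)
        else:
            # Replace a stored example with probability capacity / seen.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.memory[j] = example

    def sample(self, k):
        """Draw up to k stored examples to mix into the current batch."""
        k = min(k, len(self.memory))
        return self.rng.sample(self.memory, k)
```

During continual training, each batch from the current task would be concatenated with `buffer.sample(k)` before the gradient step, so the network keeps rehearsing earlier tasks while learning new ones.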
Keywords
Spiking neural networks, continual learning, memory replay