An Improved Dual-Channel Network to Eliminate Catastrophic Forgetting

IEEE Transactions on Systems, Man, and Cybernetics: Systems (2022)

Citations: 5
Abstract
Catastrophic forgetting is a chronic problem during the online training process of deep neural networks. That is, once a new data set is used to train an existing neural network, the network loses its ability to recognize the original data set. In the literature, online contrastive divergence (CD) with generative replay (GR) exploits the generative capacity of the neural network to facilitate onli...
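The abstract refers to generative replay (GR), in which pseudo-samples drawn from a generative model stand in for the original training data when the network is later trained on a new task. The sketch below is a rough illustration of that general idea only, not of the paper's dual-channel network or its online CD procedure: the generator's sample method, the distillation loss on replayed data, and all hyperparameters are assumptions made for the example.

```python
# Minimal generative-replay training loop (illustrative sketch, not the
# paper's method). Real batches from the new task are interleaved with
# pseudo-samples from a generator fit on earlier data, labelled by a
# frozen copy of the previously trained model.
import torch
import torch.nn.functional as F


def train_with_generative_replay(model, old_model, generator, new_loader,
                                 replay_ratio=1.0, epochs=1, lr=1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x_new, y_new in new_loader:
            # Loss on the new task's real data.
            loss = F.cross_entropy(model(x_new), y_new)

            if generator is not None and old_model is not None:
                # Replay: draw pseudo-data and distill the old model's
                # predictions on it, so earlier knowledge is retained.
                n_replay = int(replay_ratio * x_new.size(0))
                x_replay = generator.sample(n_replay)  # hypothetical generator API
                with torch.no_grad():
                    soft_targets = F.softmax(old_model(x_replay), dim=1)
                log_probs = F.log_softmax(model(x_replay), dim=1)
                loss = loss + F.kl_div(log_probs, soft_targets,
                                       reduction="batchmean")

            opt.zero_grad()
            loss.backward()
            opt.step()
    return model
```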
Keywords
Training, Neural networks, Markov processes, Task analysis, Training data, Data models, Estimation