A Broad Neural Network Structure for Class Incremental Learning.

ADVANCES IN NEURAL NETWORKS - ISNN 2018(2018)

Abstract
Class incremental learning, i.e., learning new concepts over time, is a promising research topic. Because the number of output classes is not known in advance, researchers must develop methods that model new classes while preserving the performance of the pre-trained model. The central obstacle is catastrophic forgetting: performance on old classes deteriorates when the pre-trained model is updated with new-class data alone, without revisiting the old data. In this paper, we propose a novel learning framework, the Broad Class Incremental Learning System (BCILS), to tackle this issue. When training data from unknown classes arrive, BCILS updates the model with a deduced iterative formula, in contrast to most existing fine-tuning based class incremental learning algorithms. The advantages of the proposed approach are that it is (1) easy to model, (2) flexible in structure, and (3) well preserves pre-trained performance. Finally, we conduct extensive experiments that demonstrate the superiority of the proposed BCILS.
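The abstract does not give the deduced iterative formula itself, but the idea of absorbing unseen-class data without revisiting old samples can be illustrated in the broad-learning spirit. The following is a minimal sketch, not the authors' BCILS: a ridge least-squares output layer maintained through the running statistics A^T A and A^T Y, which lets new samples, including samples of previously unseen classes, update the output weights incrementally. The class name and all parameters are illustrative assumptions.

```python
import numpy as np

class IncrementalLinearClassifier:
    """Illustrative sketch (not the authors' exact BCILS formulas): a ridge
    least-squares output layer kept up to date via the sufficient statistics
    A^T A and A^T Y, so new-class data can be absorbed without replaying
    old training data."""

    def __init__(self, n_features, reg=1e-3):
        d = n_features + 1                       # +1 for a bias column
        self.K = reg * np.eye(d)                 # running A^T A + reg * I
        self.C = np.zeros((d, 0))                # running A^T Y
        self.W = np.zeros((d, 0))                # output weights

    @staticmethod
    def _aug(X):
        # Append a constant bias feature to the inputs.
        return np.hstack([X, np.ones((X.shape[0], 1))])

    def partial_fit(self, X, labels):
        A = self._aug(X)
        n_classes = max(int(labels.max()) + 1, self.C.shape[1])
        if n_classes > self.C.shape[1]:
            # A previously unseen class arrived: grow the label matrix
            # with zero columns for the new outputs.
            pad = np.zeros((self.C.shape[0], n_classes - self.C.shape[1]))
            self.C = np.hstack([self.C, pad])
        Y = np.eye(n_classes)[labels]            # one-hot targets
        self.K += A.T @ A                        # accumulate statistics
        self.C += A.T @ Y
        self.W = np.linalg.solve(self.K, self.C)  # ridge solution

    def predict(self, X):
        return np.argmax(self._aug(X) @ self.W, axis=1)
```

Because the accumulated statistics summarize all data seen so far, refitting after a new-class batch leaves performance on the old classes intact in this linear setting, which mirrors the "preserve pre-trained performance without old data" goal stated above.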
Keywords
Class incremental learning,Neural network,Broad learning,Catastrophic forgetting