SCREAM: Knowledge sharing and compact representation for class incremental learning

Zhikun Feng, Mian Zhou, Zan Gao, Angelos Stefanidis, Zezhou Sui

INFORMATION PROCESSING & MANAGEMENT (2024)

Abstract
Methods based on dynamic structures are effective at addressing catastrophic forgetting in class-incremental learning (CIL). However, they often isolate sub-networks and overlook the integration of overall information, which degrades performance. To overcome this limitation, we recognize the importance of knowledge sharing among sub-networks. Building on dynamic networks, we propose a novel two-stage CIL method called SCREAM, consisting of an Expandable Network (EN) learning stage and a Compact Representation (CR) stage: (1) a clustering loss function for the EN that aggregates related instances and promotes information sharing; (2) dynamic weight alignment that alleviates the classifier's bias toward new-class knowledge; and (3) balanced decoupled distillation for the CR that mitigates the long-tail effect arising from repeated compression. To validate SCREAM, we evaluate it on three widely used datasets under different replay-buffer sizes against current state-of-the-art models. The results show that SCREAM exceeds the state of the art in average accuracy by 2.46% on CIFAR-100, 1.22% on ImageNet-100/1000, and 1.52% on Tiny-ImageNet. With a smaller buffer size, SCREAM's average-accuracy margin grows to 4.60%. Furthermore, SCREAM performs well in terms of resource requirements.
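The abstract does not spell out the form of the EN clustering loss, so the following is only a minimal center-loss-style sketch of one common way to aggregate related instances in feature space. The function name and the `centers` buffer (running per-class feature means) are hypothetical, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def clustering_loss(features: torch.Tensor, labels: torch.Tensor,
                    centers: torch.Tensor) -> torch.Tensor:
    """Pull each instance's embedding toward its class center so that
    related instances cluster together.

    Hypothetical sketch: the paper's actual clustering loss may differ.
    `centers` is a (num_classes, feat_dim) buffer of class means.
    """
    feats = F.normalize(features, dim=1)         # unit-norm embeddings
    cents = F.normalize(centers[labels], dim=1)  # matching class centers
    return (feats - cents).pow(2).sum(dim=1).mean()
```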
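Likewise, the abstract names dynamic weight alignment but not its mechanics. The sketch below follows the standard weight-aligning idea (rescaling new-class classifier weights so their average norm matches that of old-class weights), which the paper's dynamic variant presumably refines; `align_classifier_weights` and `num_old` are illustrative names.

```python
import torch
import torch.nn as nn

@torch.no_grad()
def align_classifier_weights(fc: nn.Linear, num_old: int) -> None:
    """Rescale new-class weight vectors so their mean norm matches the
    old classes', reducing the classifier's bias toward new classes.

    Minimal sketch of standard weight alignment, not SCREAM's exact
    dynamic variant, which the abstract does not specify.
    """
    w = fc.weight                          # shape: (num_classes, feat_dim)
    old_norms = w[:num_old].norm(dim=1)    # per-class weight norms (old)
    new_norms = w[num_old:].norm(dim=1)    # per-class weight norms (new)
    gamma = old_norms.mean() / new_norms.mean()
    w[num_old:] *= gamma                   # rescale new-class weights
```

In a typical rehearsal pipeline, such a step would run once after training on each new task, before evaluation.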
Keywords
Incremental learning, Knowledge sharing, Rehearsal, Knowledge distillation