Dynamic Support Network for Few-Shot Class Incremental Learning

IEEE Transactions on Pattern Analysis and Machine Intelligence (2023)

Citations: 28 | Views: 300
Abstract
Few-shot class-incremental learning (FSCIL) is challenged by catastrophic forgetting of old classes and over-fitting to new classes. Our analyses reveal that these problems are caused by feature distribution crumbling, which leads to class confusion when few samples are continually embedded into a fixed feature space. In this study, we propose a Dynamic Support Network (DSN), an adaptively updating network with compressive node expansion that "supports" the feature space. In each training session, DSN tentatively expands network nodes to enlarge the feature representation capacity for incremental classes. It then dynamically compresses the expanded network by node self-activation to pursue a compact feature representation, which alleviates over-fitting. Simultaneously, DSN selectively recalls old class distributions during incremental learning to support the feature distributions and avoid confusion between classes. With compressive node expansion and class distribution recalling, DSN provides a systematic solution to catastrophic forgetting and over-fitting. Experiments on the CUB, CIFAR-100, and miniImageNet datasets show that DSN significantly improves upon the baseline approach and achieves new state-of-the-art results.
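As a rough illustration of the two mechanisms named in the abstract, the sketch below shows one plausible way to realize compressive node expansion with self-activation gating and Gaussian-based distribution recalling. This is a hedged sketch under assumed design choices, not the authors' implementation: the class CompressiveExpansionHead, the function recall_old_classes, the sigmoid gate, and all thresholds and shapes are hypothetical placeholders.

```python
# Hedged sketch (assumptions, not the authors' code): tentative node expansion,
# gate-based compression ("node self-activation"), and Gaussian recall of old
# class feature distributions.
import torch
import torch.nn as nn


class CompressiveExpansionHead(nn.Module):
    """Feature head that tentatively adds nodes per session, then keeps only
    nodes whose learned self-activation gate stays above a threshold."""

    def __init__(self, in_dim, base_nodes):
        super().__init__()
        self.in_dim = in_dim
        self.weight = nn.Parameter(torch.randn(base_nodes, in_dim) * 0.01)
        self.gate = nn.Parameter(torch.ones(base_nodes))  # one self-activation per node

    def expand(self, new_nodes):
        # Tentative expansion: append candidate nodes for the incoming session.
        w_new = torch.randn(new_nodes, self.in_dim) * 0.01
        self.weight = nn.Parameter(torch.cat([self.weight.data, w_new], dim=0))
        self.gate = nn.Parameter(torch.cat([self.gate.data, torch.ones(new_nodes)]))

    def compress(self, keep_thresh=0.1):
        # Dynamic compression: drop nodes whose self-activation has collapsed.
        keep = torch.sigmoid(self.gate.data) > keep_thresh
        self.weight = nn.Parameter(self.weight.data[keep])
        self.gate = nn.Parameter(self.gate.data[keep])

    def forward(self, x):
        # Gated projection: each node's output is scaled by its self-activation.
        return (x @ self.weight.t()) * torch.sigmoid(self.gate)


def recall_old_classes(class_means, class_stds, samples_per_class=5):
    """Assumed form of distribution recalling: re-sample pseudo features of old
    classes from stored per-class Gaussian statistics to regularize new sessions."""
    feats, labels = [], []
    for c, (mu, std) in enumerate(zip(class_means, class_stds)):
        feats.append(mu + std * torch.randn(samples_per_class, mu.numel()))
        labels.append(torch.full((samples_per_class,), c, dtype=torch.long))
    return torch.cat(feats), torch.cat(labels)


if __name__ == "__main__":
    head = CompressiveExpansionHead(in_dim=64, base_nodes=32)
    head.expand(new_nodes=16)                   # new session: tentative expansion
    out = head(torch.randn(4, 64))              # forward pass with gated nodes
    head.compress(keep_thresh=0.1)              # prune weakly activated nodes
    mus = [torch.randn(64) for _ in range(3)]   # stored old-class statistics
    stds = [torch.ones(64) * 0.1 for _ in range(3)]
    f, y = recall_old_classes(mus, stds)
    print(out.shape, head.weight.shape, f.shape, y.shape)
```

In this reading, the gate values stand in for the node self-activation described in the abstract: nodes whose gates collapse during a session are pruned so the expanded network stays compact, while the recalled pseudo features of old classes would be mixed into training to keep old and new class distributions separated.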
Keywords
Compressive network expansion, distribution recalling, few-shot learning, incremental learning, support network