Few-Shot Class Incremental Learning Via Robust Transformer Approach

INFORMATION SCIENCES (2024)

Abstract
Few-Shot Class-Incremental Learning (FSCIL) is an extension of the Class-Incremental Learning (CIL) problem in which a model must cope with data scarcity while also addressing the Catastrophic Forgetting (CF) problem. The problem remains open because recent works are built upon Convolutional Neural Networks (CNNs), which perform sub-optimally compared to transformer approaches. Our paper presents the Robust Transformer Approach (ROBUSTA), built upon the Compact Convolutional Transformer (CCT). The issue of overfitting due to few samples is overcome with the notion of a stochastic classifier, where the classifier's weights are sampled from a distribution with mean and variance vectors, thus increasing the likelihood of correct classifications, together with a batch-norm layer that stabilizes the training process. The issue of CF is addressed with the idea of delta parameters: small, task-specific trainable parameters learned while the backbone network is kept frozen. A non-parametric approach is developed to infer the delta parameters for the model's predictions. A prototype rectification approach is applied to avoid biased prototype calculations caused by data scarcity. The advantage of ROBUSTA is demonstrated through a series of experiments on benchmark problems, where it outperforms prior arts by large margins without any data augmentation protocols.
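To make the abstract's mechanisms concrete, below are minimal PyTorch-style sketches of the three main ideas. The abstract gives no implementation details, so every name, dimension, and design choice here (reparameterized Gaussian weights, cosine logits, nearest-prototype task inference, the blending rule) is an illustrative assumption, not ROBUSTA's actual code.

First, a stochastic classifier: weights are drawn from a learned Gaussian via the reparameterization trick during training, and the mean weights are used at inference.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StochasticClassifier(nn.Module):
    """Classifier weights sampled as w = mu + sigma * eps, eps ~ N(0, I)."""
    def __init__(self, feat_dim: int, num_classes: int):
        super().__init__()
        self.mu = nn.Parameter(0.01 * torch.randn(num_classes, feat_dim))
        # parameterize the variance in log space so sigma stays positive
        self.log_var = nn.Parameter(torch.zeros(num_classes, feat_dim))

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        if self.training:
            # reparameterization trick keeps the sampling differentiable
            eps = torch.randn_like(self.mu)
            weight = self.mu + torch.exp(0.5 * self.log_var) * eps
        else:
            weight = self.mu  # deterministic mean weights at test time
        # cosine-similarity logits (an assumption; common in FSCIL heads)
        return F.linear(F.normalize(feats, dim=-1), F.normalize(weight, dim=-1))
```

Second, delta parameters: a small trainable offset per task applied on top of frozen backbone features, with a non-parametric nearest-prototype rule standing in for the paper's delta-inference procedure.

```python
import torch
import torch.nn as nn

class DeltaAdapter(nn.Module):
    """One small trainable offset ("delta") per task; the backbone stays
    frozen, so only these parameters are updated for a new task."""
    def __init__(self, feat_dim: int, num_tasks: int):
        super().__init__()
        self.deltas = nn.Parameter(torch.zeros(num_tasks, feat_dim))

    def forward(self, frozen_feats: torch.Tensor, task_id: int) -> torch.Tensor:
        return frozen_feats + self.deltas[task_id]

def infer_task(frozen_feats: torch.Tensor, task_prototypes: torch.Tensor) -> int:
    """Non-parametric task inference: pick the task whose stored prototype
    is nearest to the mean query feature. Purely illustrative."""
    query = frozen_feats.mean(dim=0, keepdim=True)   # (1, D)
    dists = torch.cdist(query, task_prototypes)      # (1, T)
    return int(dists.argmin())
```

Third, prototype rectification: few-shot class prototypes are adjusted using soft-assigned query features, reducing the bias of prototypes estimated from very few samples.

```python
import torch
import torch.nn.functional as F

def rectify_prototypes(prototypes: torch.Tensor,
                       query_feats: torch.Tensor,
                       temperature: float = 10.0) -> torch.Tensor:
    """prototypes: (C, D) few-shot class means; query_feats: (Q, D) unlabeled
    features. Blends each prototype with a soft-assigned average of queries."""
    p = F.normalize(prototypes, dim=-1)
    q = F.normalize(query_feats, dim=-1)
    weights = (q @ p.t() * temperature).softmax(dim=-1)       # (Q, C)
    agg = weights.t() @ q                                      # (C, D)
    counts = weights.sum(dim=0).unsqueeze(1).clamp(min=1e-6)   # (C, 1)
    # the 50/50 blend is an arbitrary illustrative choice, not the paper's rule
    return 0.5 * p + 0.5 * (agg / counts)
```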
Keywords
Continual learning, Class incremental learning, Few-shot class-incremental learning, Few-shot learning