Controllable Relation Disentanglement for Few-Shot Class-Incremental Learning
arXiv (2024)
Abstract
In this paper, we propose to tackle Few-Shot Class-Incremental Learning
(FSCIL) from a new perspective, i.e., relation disentanglement, which means
enhancing FSCIL by disentangling spurious relations between categories. The
challenge of disentangling spurious correlations lies in the poor
controllability of FSCIL. On one hand, an FSCIL model is required to be trained
in an incremental manner and thus it is very hard to directly control
relationships between categories of different sessions. On the other hand,
only a few training samples are available per novel category, which further
increases the difficulty of alleviating spurious relations. To
overcome this challenge, in this paper, we propose a new simple-yet-effective
method, called ConTrollable Relation-disentangLed Few-Shot Class-Incremental
Learning (CTRL-FSCIL). Specifically, during the base session, we propose to
anchor base category embeddings in feature space and construct disentanglement
proxies to bridge gaps between the learning for category representations in
different sessions, thereby making category relation controllable. During
incremental learning, the parameters of the backbone network are frozen in
order to relieve the negative impact of data scarcity. Moreover, a
disentanglement loss is designed to effectively guide a relation
disentanglement controller to disentangle spurious correlations between the
embeddings encoded by the backbone. In this way, the spurious correlation issue
in FSCIL can be suppressed. Extensive experiments on CIFAR-100, mini-ImageNet,
and CUB-200 datasets demonstrate the effectiveness of our CTRL-FSCIL method.
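The abstract does not include code, but the core idea — suppressing spurious correlation between a novel-category embedding (produced by the frozen backbone) and an anchored base-category prototype — can be illustrated with a toy sketch. Here the learned relation-disentanglement controller is replaced by a simple orthogonal projection for clarity; all names and values are hypothetical, not from the paper.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors (a proxy for category relation)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def project_out(x, anchor):
    """Remove the component of x along the anchored prototype direction,
    a stand-in for what the learned controller is trained to do."""
    na = math.sqrt(sum(a * a for a in anchor))
    unit = [a / na for a in anchor]
    coeff = sum(a * b for a, b in zip(x, unit))
    return [xi - coeff * ui for xi, ui in zip(x, unit)]

# Hypothetical 3-d embeddings: a base prototype anchored during the base
# session, and a novel-category embedding from the frozen backbone.
base_proto = [1.0, 0.0, 0.0]
novel_emb = [0.8, 0.5, 0.2]

before = cosine(novel_emb, base_proto)          # spurious correlation present
disentangled = project_out(novel_emb, base_proto)
after = cosine(disentangled, base_proto)        # correlation suppressed to ~0
```

In CTRL-FSCIL this suppression is not a fixed projection: a controller is trained with the disentanglement loss to perform it, while the backbone stays frozen to cope with few-shot data scarcity.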