Cascaded Scaling Classifier: class incremental learning with probability scaling
CoRR (2024)
Abstract
Humans are capable of acquiring new knowledge and transferring learned
knowledge across different domains while incurring only minimal forgetting.
The same ability, called Continual Learning, is challenging to achieve with
neural networks, since learning new tasks causes forgetting of previously
learned ones. This forgetting can be mitigated by replaying stored samples
from past tasks, but long sequences of tasks may require a large memory;
moreover, replay can lead to overfitting on the saved samples. In this paper,
we propose a novel regularisation approach and a novel incremental
classifier, called Margin Dampening and Cascaded Scaling Classifier,
respectively. The former combines a soft constraint with a knowledge
distillation approach to preserve previously learned knowledge while allowing
the model to learn new patterns effectively. The latter is a gated
incremental classifier that helps the model adjust past predictions without
directly interfering with them; this is achieved by modifying the model's
output with auxiliary scaling functions. We empirically show that our
approach performs well on multiple benchmarks against well-established
baselines, and we study each component of our proposal and how their
combinations affect the final results.
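To make the idea of a gated incremental classifier more concrete, the following is a minimal, hypothetical sketch of the general pattern the abstract describes: each new task adds a classification head, and the logits of every head are modulated by a learned scaling gate rather than past heads being overwritten. All names (`CascadedScalingClassifier`, `add_task`, the scalar sigmoid gates) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class CascadedScalingClassifier:
    """Illustrative sketch (not the paper's code): one head per task,
    with a learnable scalar gate scaling each head's logits."""

    def __init__(self, feat_dim):
        self.feat_dim = feat_dim
        self.heads = []   # per-task weight matrices (n_classes x feat_dim)
        self.gates = []   # per-task scalar gate parameters

    def add_task(self, n_classes, rng):
        # New head for the new task's classes; past heads stay frozen
        # and are only rescaled through their gates during training.
        self.heads.append(rng.standard_normal((n_classes, self.feat_dim)) * 0.01)
        self.gates.append(0.0)  # sigmoid(0) = 0.5: neutral initial scaling

    def forward(self, features):
        # Each head's logits are scaled by its gate, then concatenated,
        # so past predictions are modified indirectly rather than rewritten.
        parts = [sigmoid(g) * (W @ features)
                 for W, g in zip(self.heads, self.gates)]
        return np.concatenate(parts)

rng = np.random.default_rng(0)
clf = CascadedScalingClassifier(feat_dim=8)
clf.add_task(3, rng)   # task 1: 3 classes
clf.add_task(2, rng)   # task 2: 2 classes
features = rng.standard_normal(8)
logits = clf.forward(features)
print(logits.shape)    # (5,) -- 3 + 2 classes across both tasks
```

In such a scheme, training on a new task would update the gates (and the new head), letting the model rebalance old outputs without directly editing the old heads' weights; this matches the abstract's claim of modifying past predictions via auxiliary scaling functions.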