Exploiting the Semantic Knowledge of Pre-trained Text-Encoders for Continual Learning
arXiv (2024)
Abstract
Deep neural networks (DNNs) excel on fixed datasets but struggle with
incremental and shifting data in real-world scenarios. Continual learning
addresses this challenge by allowing models to learn from new data while
retaining previously learned knowledge. Existing methods mainly rely on visual
features, often neglecting the rich semantic information encoded in text. The
class labels of the images carry semantic knowledge that can be related to
previously acquired knowledge of semantic classes. Consequently, effectively
leveraging this information throughout continual learning is expected to be
beneficial. To
address this, we propose integrating semantic guidance within and across tasks
by capturing semantic similarity using text embeddings. We start from a
pre-trained CLIP model, employ the Semantically-guided Representation
Learning (SG-RL) module to obtain soft assignments over all current-task
classes, and use the Semantically-guided Knowledge Distillation (SG-KD) module
for enhanced knowledge transfer. Experimental results demonstrate the
superiority of our method on general and fine-grained datasets. Our code can be
found at
https://github.com/aprilsveryown/semantically-guided-continual-learning.
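
To make the soft-assignment idea concrete, the following is a minimal sketch of how per-class soft targets could be derived from the pairwise similarity of CLIP text embeddings of the current-task class names. It is not taken from the paper's repository; the prompt template, temperature value, class names, and the `semantic_ce` helper are illustrative assumptions.

```python
# Hypothetical sketch: soft class assignments from CLIP text-embedding similarity.
import torch
import torch.nn.functional as F
import clip  # OpenAI CLIP package (pip install git+https://github.com/openai/CLIP.git)

device = "cuda" if torch.cuda.is_available() else "cpu"
model, _ = clip.load("ViT-B/32", device=device)

# Current-task class names (illustrative).
class_names = ["sparrow", "eagle", "owl", "penguin"]
tokens = clip.tokenize([f"a photo of a {c}" for c in class_names]).to(device)

with torch.no_grad():
    text_feats = model.encode_text(tokens).float()
    text_feats = F.normalize(text_feats, dim=-1)

# Pairwise cosine similarity between class-name embeddings, shape (C, C).
sim = text_feats @ text_feats.t()

# Soft assignment: each ground-truth class maps to a distribution over all
# current-task classes instead of a one-hot target (temperature is a guess).
temperature = 0.05
soft_targets = F.softmax(sim / temperature, dim=-1)  # row c = soft label for class c

def semantic_ce(logits, labels):
    """Cross-entropy between image-class logits (B, C) and the soft targets
    of the ground-truth classes, rather than one-hot labels."""
    log_probs = F.log_softmax(logits, dim=-1)
    return -(soft_targets[labels] * log_probs).sum(dim=-1).mean()
```

Replacing one-hot targets with similarity-derived soft labels is one plausible reading of "soft assignment towards all current task classes"; the actual SG-RL and SG-KD formulations should be taken from the paper and its released code.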