Explicit knowledge transfer of graph-based correlation distillation and diversity data hallucination for few-shot object detection

Image and Vision Computing (2024)

Abstract
The performance of few-shot object detection has seen marked improvement through fine-tuning paradigms. However, existing methods often depend on shared parameters to transfer knowledge implicitly, without explicit induction. As a result, novel-class representations are easily confused with similar base classes and poorly cover the diverse variation patterns of the true data distribution. To address this, the present paper focuses on mining transferable base-class knowledge, which we subdivide into inter-class correlation and intra-class diversity. First, we design a graph that dynamically captures the relationships between base- and novel-class representations, and introduce distillation techniques to compensate for the shortage of correlation knowledge in few-shot labels. Furthermore, we propose an efficient diversity knowledge transfer module based on data hallucination, which adaptively disentangles class-independent variation patterns from base-class features and generates additional trainable hallucinated instances for novel classes. Experiments on the VOC and COCO datasets confirm that our method effectively reduces reliance on novel-class samples and outperforms other state-of-the-art baselines.
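The hallucination idea described above can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the feature dimensions, the residual-based notion of "class-independent variation" (instance feature minus class prototype), and the function name `hallucinate` are all illustrative assumptions.

```python
import numpy as np

def hallucinate(base_feats, base_proto, novel_feats, n_samples, seed=0):
    """Hypothetical sketch: transfer variation patterns from a base class
    to a novel class by re-applying base-class residuals to novel features."""
    # Class-independent variation: offsets of base instances from their prototype.
    variations = base_feats - base_proto                      # (N_base, D)
    rng = np.random.default_rng(seed)
    picked = variations[rng.choice(len(variations), size=n_samples)]
    # Apply each sampled variation pattern to a randomly chosen novel instance.
    anchors = novel_feats[rng.choice(len(novel_feats), size=n_samples)]
    return anchors + picked                                   # hallucinated novel instances

# Toy demo with 2-D features.
base = np.array([[1.0, 1.0], [3.0, 1.0], [2.0, 3.0]])
proto = base.mean(axis=0)
novel = np.array([[10.0, 10.0]])
fake = hallucinate(base, proto, novel, n_samples=5)
print(fake.shape)  # (5, 2)
```

The hallucinated instances inherit the novel class's location in feature space while borrowing the spread of the base class, which is the intuition behind generating extra trainable samples for data-scarce novel classes.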
Keywords
Few-shot object detection, Graph convolutional network, Knowledge distillation, Data hallucination