OBJECT-ORIENTED RELATIONAL DISTILLATION FOR OBJECT DETECTION

2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021)

Abstract
Object detection models have achieved increasingly better performance through more complex architecture designs, but their heavy computation limits wider deployment on devices with insufficient computational power. To this end, we propose a novel Object-Oriented Relational Distillation (OORD) method that enables small detection models to approach the performance of large detection models while keeping their efficiency unchanged. Specifically, we distill relative relation knowledge from teacher (large) models to student (small) models, which helps the small models learn better soft feature representations under the guidance of the large models. OORD consists of two parts, i.e., Object Extraction (OE) and Relation Distillation (RD). OE extracts foreground features to avoid interference from background features, and RD distills the relative relations between the foreground features through graph convolution. Experiments on various kinds of detection models show the effectiveness of OORD, which improves the performance of the small model by nearly 10% without additional inference-time cost.
Keywords
Object detection, object-oriented relational distillation, knowledge distillation
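The abstract describes OORD only at a high level, so the following PyTorch sketch is one possible reading of it, not the authors' implementation. The RoIAlign-based object extraction, the single similarity-weighted graph-propagation step, the projection layers, and all names and feature dimensions are assumptions made for illustration.

```python
# Hedged sketch of the OORD idea (Object Extraction + Relation Distillation).
# All module names, shapes, and design details are assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


def extract_object_features(feat_map, boxes, out_size=7):
    """Object Extraction (OE), assumed form: pool foreground features per box with
    RoIAlign so background regions do not interfere with distillation."""
    from torchvision.ops import roi_align  # only needed for this helper
    # feat_map: (B, C, H, W); boxes: list of (num_boxes, 4) tensors in image coords
    pooled = roi_align(feat_map, boxes, output_size=out_size, spatial_scale=1.0 / 16)
    return pooled.flatten(1)  # (total_boxes, C * out_size * out_size)


class RelationDistillation(nn.Module):
    """Relation Distillation (RD), assumed form: propagate each model's object
    features over a fully connected relation graph (one graph-convolution step)
    and match the student's propagated features to the teacher's."""

    def __init__(self, student_dim, teacher_dim, hidden_dim=256):
        super().__init__()
        # project both feature sets into a shared space (assumption)
        self.student_proj = nn.Linear(student_dim, hidden_dim)
        self.teacher_proj = nn.Linear(teacher_dim, hidden_dim)

    @staticmethod
    def graph_propagate(x):
        # adjacency from pairwise feature similarity, row-normalized
        adj = F.softmax(x @ x.t() / x.shape[-1] ** 0.5, dim=-1)
        return adj @ x  # aggregate relational context for every foreground object

    def forward(self, student_obj_feats, teacher_obj_feats):
        s = self.graph_propagate(self.student_proj(student_obj_feats))
        t = self.graph_propagate(self.teacher_proj(teacher_obj_feats))
        return F.mse_loss(s, t.detach())  # teacher branch is frozen


if __name__ == "__main__":
    # toy example: 5 foreground objects, 256-d student / 512-d teacher features
    student_feats = torch.randn(5, 256)
    teacher_feats = torch.randn(5, 512)
    rd = RelationDistillation(student_dim=256, teacher_dim=512)
    loss = rd(student_feats, teacher_feats)
    print(loss.item())
```

In this reading, the relation loss would be added to the student's ordinary detection loss during training and discarded at inference, which is consistent with the abstract's claim of no additional inference-time cost.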