Cross-layer patch alignment and intra-and-inter patch relations for knowledge distillation

2023 IEEE International Conference on Image Processing (ICIP 2023)

Abstract
To date, most existing distillation methods exploit coarse-grained instance-level information as the knowledge to transfer, such as instance logits, instance features, and instance relations. However, the fine-grained knowledge of internal regions and the relationships between semantic entities within a single instance are overlooked and not fully explored. To address these limitations, we propose a novel fine-grained patch-level distillation method, dubbed Patch-Aware Knowledge Distillation (PAKD). PAKD rethinks knowledge distillation from a new perspective, focusing on the significance of cross-layer patch alignment and of patch relations within and across instances. Specifically, we first devise a novel cross-layer architecture that fuses patches across stages, which is capable of using multi-level information from the teacher to guide single-level learning in the student. We then propose cross-layer patch alignment, allowing the student to attend to patches discriminatively and find the best way to learn from the teacher. In addition, patch relations within and across instances are leveraged to supervise structural knowledge distillation in the manifold space. We apply our method to image classification and object detection. Consistent improvements over state-of-the-art approaches on different datasets and diverse teacher-student combinations demonstrate the potential of the proposed PAKD.
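The abstract does not give the exact loss formulation, but the three ingredients it names (patch alignment between student and teacher, intra-instance patch relations, and inter-instance patch relations) can be illustrated with a minimal PyTorch sketch. All names (PatchDistillLoss, proj) and the specific loss terms below are illustrative assumptions, not the authors' published PAKD formulation.

```python
# Hypothetical sketch of patch-level distillation losses; not the PAKD paper's
# exact method. Each spatial location of a feature map is treated as a "patch".
import torch
import torch.nn as nn
import torch.nn.functional as F


class PatchDistillLoss(nn.Module):
    def __init__(self, student_dim: int, teacher_dim: int):
        super().__init__()
        # 1x1 projection so student patches match the teacher's channel width.
        self.proj = nn.Conv2d(student_dim, teacher_dim, kernel_size=1)

    @staticmethod
    def _patch_tokens(feat: torch.Tensor) -> torch.Tensor:
        # (B, C, H, W) -> (B, H*W, C): one token per spatial location.
        return feat.flatten(2).transpose(1, 2)

    def forward(self, f_s: torch.Tensor, f_t: torch.Tensor) -> torch.Tensor:
        f_s = self.proj(f_s)
        p_s = F.normalize(self._patch_tokens(f_s), dim=-1)   # (B, N, C)
        p_t = F.normalize(self._patch_tokens(f_t), dim=-1)   # (B, N, C)

        # Patch alignment: pull corresponding student/teacher patches together.
        align = (1.0 - (p_s * p_t).sum(-1)).mean()

        # Intra-instance relations: patch-to-patch similarities within an image.
        intra = F.mse_loss(p_s @ p_s.transpose(1, 2), p_t @ p_t.transpose(1, 2))

        # Inter-instance relations: similarities among patches drawn from the
        # whole batch (batch and patch dimensions flattened together).
        q_s, q_t = p_s.flatten(0, 1), p_t.flatten(0, 1)       # (B*N, C)
        inter = F.mse_loss(q_s @ q_s.T, q_t @ q_t.T)

        return align + intra + inter


# Example usage with dummy backbone features.
loss_fn = PatchDistillLoss(student_dim=256, teacher_dim=512)
loss = loss_fn(torch.randn(4, 256, 7, 7), torch.randn(4, 512, 7, 7))
```

In this sketch the relation terms match Gram matrices of normalized patch features; the paper's cross-layer fusion of multiple teacher stages is omitted.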
Keywords
Knowledge distillation, Cross-layer patch alignment, Intra- and inter-patch relations