Dynamic semantic structure distillation for low-resolution fine-grained recognition

Pattern Recognition (2024)

Abstract
Low-resolution images are ubiquitous in real applications such as surveillance and mobile photography. However, existing fine-grained approaches usually suffer catastrophic failures when dealing with low-resolution inputs. This is because their learning strategy inherently depends on the semantic structure of the pre-trained model, resulting in poor robustness and generalization. To mitigate this limitation, we propose a dynamic semantic structure distillation learning framework. Our method first facilitates knowledge distillation of diverse semantic structures by perturbing the composition of semantic components, and then utilizes a decoupled distillation objective to prevent the loss of primary semantic part-relation knowledge. We evaluate our proposed approach on two knowledge distillation tasks: high-to-low-resolution and large-to-small-model distillation. The experimental results show that our proposed approach significantly outperforms existing methods on low-resolution fine-grained image classification tasks, indicating that it can effectively distill knowledge from high-resolution teacher models to low-resolution student models. Furthermore, we demonstrate the effectiveness of our approach on general image classification and standard knowledge distillation tasks.
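The following is a minimal sketch of the general setting the abstract describes: a high-resolution teacher distilling into a low-resolution student with a decoupled distillation objective that separates target-class and non-target-class knowledge. It is an illustration under assumptions only; the paper's semantic-structure perturbation, its exact loss formulation, and all names and hyper-parameters below (`decoupled_kd_loss`, `distill_step`, `alpha`, `beta`, `temperature`, `low_res`) are hypothetical placeholders, not the authors' implementation.

```python
# Hedged sketch: high-to-low-resolution distillation with a decoupled KD loss.
# The model architectures, loss decomposition, and hyper-parameters are assumptions.
import torch
import torch.nn.functional as F


def decoupled_kd_loss(student_logits, teacher_logits, labels,
                      alpha=1.0, beta=2.0, temperature=4.0):
    """Split classic KD into a target-class term and a non-target-class term,
    weighted separately (a common 'decoupled' scheme; not the paper's exact loss)."""
    t = temperature
    p_s = F.softmax(student_logits / t, dim=1)
    p_t = F.softmax(teacher_logits / t, dim=1)

    mask = F.one_hot(labels, num_classes=student_logits.size(1)).bool()
    p_s_target = p_s[mask]            # probability of the ground-truth class, shape (B,)
    p_t_target = p_t[mask]
    p_s_other = 1.0 - p_s_target
    p_t_other = 1.0 - p_t_target

    eps = 1e-8
    # Target-class term: KL over the binary (target vs. rest) distribution.
    tckd = (p_t_target * torch.log((p_t_target + eps) / (p_s_target + eps)) +
            p_t_other * torch.log((p_t_other + eps) / (p_s_other + eps))).mean()

    # Non-target-class term: KL over the distribution restricted to non-target classes
    # (the ground-truth logit is suppressed before the softmax).
    suppress = mask.float() * 1000.0
    log_q_s = F.log_softmax(student_logits / t - suppress, dim=1)
    q_t = F.softmax(teacher_logits / t - suppress, dim=1)
    nckd = F.kl_div(log_q_s, q_t, reduction='batchmean')

    return (alpha * tckd + beta * nckd) * (t ** 2)


def distill_step(teacher, student, optimizer, images_hr, labels, low_res=64):
    """One training step: the teacher sees high-resolution inputs, the student
    sees a downsampled copy, and the student is trained with CE + decoupled KD."""
    images_lr = F.interpolate(images_hr, size=(low_res, low_res),
                              mode='bilinear', align_corners=False)
    with torch.no_grad():
        teacher_logits = teacher(images_hr)   # frozen high-resolution teacher
    student_logits = student(images_lr)       # low-resolution student

    loss = F.cross_entropy(student_logits, labels) + \
           decoupled_kd_loss(student_logits, teacher_logits, labels)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch the decoupling simply re-weights the target and non-target parts of the softened teacher distribution; the paper additionally perturbs the composition of semantic components before distillation, which is not modeled here.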
Keywords
Low-resolution, Fine-grained recognition, Image classification, Distillation