A Study of Self Distillation for Mango Image Classification

2020 International Computer Symposium (ICS)

Cited by 1
Abstract
We study a knowledge transfer approach called self distillation on a mango image dataset. Taking the deepest part of a convolutional neural network as the teacher, self distillation transfers the relatively richer knowledge of the deepest part to the shallow parts of the same network, which are viewed as the students. We verify that this approach is effective on the target mango image dataset. Furthermore, we propose two additional losses that account for the data characteristics to improve performance. In the discussion, we not only verify the effectiveness of self distillation but also point out a weakness of the current approach, which reveals potential improvements to self distillation in the future.
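The abstract does not specify the network architecture or the two additional losses, so the snippet below is only a minimal sketch of the basic self-distillation setup it describes, written in PyTorch. The stage sizes, the assumed three mango-grade classes, and the temperature and alpha hyperparameters are illustrative assumptions, not details from the paper: the deepest classifier is trained on the labels and its softened predictions supervise the shallow classifiers.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfDistilledCNN(nn.Module):
    """Small CNN with auxiliary (shallow) classifiers used as students.

    The deepest classifier acts as the teacher; its softened outputs
    supervise the shallow heads in addition to the ground-truth labels.
    (Illustrative architecture, not the one used in the paper.)
    """

    def __init__(self, num_classes: int = 3):  # assumed: 3 mango grades
        super().__init__()
        # Three convolutional stages of increasing depth.
        self.stage1 = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.stage2 = nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.stage3 = nn.Sequential(nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1))
        # Shallow (student) classifiers branch off the early stages.
        self.head1 = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes))
        self.head2 = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, num_classes))
        # Deepest (teacher) classifier.
        self.head3 = nn.Sequential(nn.Flatten(), nn.Linear(128, num_classes))

    def forward(self, x):
        f1 = self.stage1(x)
        f2 = self.stage2(f1)
        f3 = self.stage3(f2)
        return self.head1(f1), self.head2(f2), self.head3(f3)


def self_distillation_loss(logits_shallow, logits_deep, targets,
                           temperature: float = 3.0, alpha: float = 0.5):
    """Cross-entropy on labels plus KL divergence toward the deepest head."""
    total = F.cross_entropy(logits_deep, targets)  # teacher trained on labels only
    soft_teacher = F.softmax(logits_deep.detach() / temperature, dim=1)
    for logits in logits_shallow:
        ce = F.cross_entropy(logits, targets)
        kd = F.kl_div(F.log_softmax(logits / temperature, dim=1),
                      soft_teacher, reduction="batchmean") * temperature ** 2
        total = total + (1 - alpha) * ce + alpha * kd
    return total


# Usage: one training step on a dummy batch of RGB images.
model = SelfDistilledCNN(num_classes=3)
images = torch.randn(8, 3, 64, 64)
labels = torch.randint(0, 3, (8,))
out1, out2, out3 = model(images)
loss = self_distillation_loss([out1, out2], out3, labels)
loss.backward()
```

At inference time the shallow heads can be used on their own, which is what links self distillation to model compression: a truncated network with only the early stages and one shallow head is cheaper to run than the full model.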
Keywords
self distillation, model compression, image classification