Adversarial Co-distillation Learning for Image Recognition

Haoran Zhang
Zhenzhen Hu
Wei Qin
Mingliang Xu

Pattern Recognition, 2020, Article 107659.


Abstract:

Knowledge distillation is an effective way to transfer the knowledge from a pre-trained teacher model to a student model. Co-distillation, as an online variant of distillation, further accelerates the training process and paves a new way to explore the “dark knowledge” by training n models in parallel. In this paper, we explore the “diver...
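To make the co-distillation idea in the abstract concrete, below is a minimal NumPy sketch of the standard soft-target distillation loss (temperature-softened KL divergence, as in Hinton et al.) applied symmetrically across n peer models trained in parallel. This is a generic illustration of co-distillation, not the paper's specific adversarial method; the function names and the temperature value are illustrative assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution,
    # exposing more of the "dark knowledge" in non-target classes.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_kl(teacher_logits, student_logits, T=4.0):
    # KL(p_teacher || p_student) on softened outputs, scaled by T^2
    # so gradients keep a consistent magnitude across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T**2)

def co_distillation_loss(logits_list, T=4.0):
    # Online co-distillation: no fixed pre-trained teacher; each of the
    # n peer models treats every other peer's softened prediction as a
    # soft target, averaged over all ordered pairs.
    n = len(logits_list)
    total, count = 0.0, 0
    for i in range(n):
        for j in range(n):
            if i != j:
                total += distillation_kl(logits_list[j], logits_list[i], T)
                count += 1
    return total / count
```

In practice each peer would add this term to its own cross-entropy loss on the ground-truth labels; the sketch only shows the mutual soft-target term.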
