Online Ensemble Model Compression using Knowledge Distillation
European Conference on Computer Vision (ECCV), pp. 18–35, 2020.
This paper presents a novel knowledge-distillation-based model compression framework built around a student ensemble. It distills the simultaneously learnt ensemble knowledge onto each of the compressed student models. Because each model has a distinct architecture, it learns unique representations of the data distribution.
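The core idea (each student distilled online from the ensemble's aggregated predictions) can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact method: it assumes the ensemble "teacher" is the average of the students' softened probabilities and uses a temperature-scaled KL-divergence distillation loss, a common formulation in online knowledge distillation.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def online_ensemble_distillation_loss(student_logits, T=2.0):
    """Hypothetical sketch: distill each student toward the ensemble.

    The teacher signal is the mean of all students' softened
    predictions; each student's loss is KL(ensemble || student),
    scaled by T^2 as is conventional in distillation.
    """
    probs = [softmax(s, T) for s in student_logits]
    ensemble = np.mean(probs, axis=0)  # ensemble soft targets
    losses = []
    for p in probs:
        kl = np.sum(
            ensemble * (np.log(ensemble + 1e-12) - np.log(p + 1e-12)),
            axis=-1,
        )
        losses.append(kl.mean() * T * T)
    return losses

# Toy example: three students, batch of 4 samples, 5 classes.
rng = np.random.default_rng(0)
logits = [rng.normal(size=(4, 5)) for _ in range(3)]
losses = online_ensemble_distillation_loss(logits)
print(losses)
```

In a real training loop each per-student distillation loss would be added to that student's cross-entropy loss on the ground-truth labels and all students would be optimized jointly in one pass, with no separate pre-trained teacher.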