Online Ensemble Model Compression using Knowledge Distillation

Devesh Walawalkar

European Conference on Computer Vision (ECCV), pp. 18-35, 2020.


Abstract:

This paper presents a novel knowledge distillation-based model compression framework consisting of a student ensemble. It enables distillation of simultaneously learnt ensemble knowledge onto each of the compressed student models. Each model learns unique representations from the data distribution due to its distinct architecture. This ...
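The core idea, an ensemble teacher formed online from the students' own softened outputs and distilled back onto each student, might be sketched as follows. This is a minimal NumPy illustration under assumed names: the function names, temperature value, and simple averaging scheme are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher temperature softens the distribution.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q) per sample, summed over classes.
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

def ensemble_distillation_losses(student_logits, temperature=3.0):
    """student_logits: list of (batch, num_classes) arrays, one per student.

    The ensemble teacher is built online as the average of the students'
    softened predictions; each student's distillation loss pulls it toward
    that shared target (assumed simple mean, the paper may weight students
    differently or add a ground-truth cross-entropy term).
    """
    teacher = np.mean([softmax(s, temperature) for s in student_logits], axis=0)
    return [kl_divergence(teacher, softmax(s, temperature)).mean()
            for s in student_logits]
```

When all students agree, the teacher coincides with each student's output and the distillation losses vanish; disagreement between the architecturally distinct students produces a non-zero pull toward the consensus.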
