Relational Knowledge Distillation

Wonpyo Park, Dongju Kim, Yan Lu, Minsu Cho

CVPR, 2019.


Abstract:

Knowledge distillation aims at transferring knowledge acquired in one model (a teacher) to another model (a student) that is typically smaller. Previous approaches can be expressed as a form of training the student to mimic output activations of individual data examples represented by the teacher. We introduce a novel approach, dubbed relational knowledge distillation (RKD), that transfers mutual relations of data examples instead. For concrete realizations of RKD, we propose distance-wise and angle-wise distillation losses that penalize structural differences in relations. Experiments conducted on different tasks show that the proposed method improves educated student models with a significant margin. In particular for metric learning, it allows students to outperform their teachers' performance, achieving the state of the art on standard benchmark datasets.
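
To make the idea concrete, below is a minimal PyTorch sketch of the distance-wise relational loss the abstract describes: instead of matching individual outputs, the student is trained to reproduce the teacher's pairwise-distance structure over a mini-batch. The function names (pdist, rkd_distance_loss), the mean-distance normalization, the epsilon clamp, and the weighting factor lambda_rkd are illustrative assumptions for this sketch, not the authors' released code.

import torch
import torch.nn.functional as F

def pdist(e: torch.Tensor) -> torch.Tensor:
    # Pairwise Euclidean distances between rows of an embedding matrix,
    # via ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b. The small clamp avoids
    # sqrt(0) on the diagonal, which would give undefined gradients.
    prod = e @ e.t()
    sq = prod.diagonal()
    return (sq.unsqueeze(0) + sq.unsqueeze(1) - 2.0 * prod).clamp(min=1e-12).sqrt()

def rkd_distance_loss(student_emb: torch.Tensor, teacher_emb: torch.Tensor) -> torch.Tensor:
    # Distance-wise RKD sketch: normalize each distance matrix by its mean
    # (an assumed detail) so teacher and student embedding spaces are
    # scale-comparable, then penalize structural differences with a
    # Huber (smooth L1) loss. Teacher distances carry no gradient.
    with torch.no_grad():
        t_d = pdist(teacher_emb)
        t_d = t_d / t_d.mean()
    s_d = pdist(student_emb)
    s_d = s_d / s_d.mean()
    return F.smooth_l1_loss(s_d, t_d)

# Hypothetical usage: embeddings for the same mini-batch from both networks.
# student_emb, teacher_emb = student(x), teacher(x)
# loss = task_loss + lambda_rkd * rkd_distance_loss(student_emb, teacher_emb)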
