Learning Lightweight Face Detector with Knowledge Distillation

2019 International Conference on Biometrics (ICB)(2019)

Cited by 11 | Views 39
Abstract
Although face detection has progressed significantly in recent years, obtaining a fast face detector with competitive performance remains challenging, especially on CPU-based devices. In this paper, we propose a novel loss function based on knowledge distillation to boost the performance of lightweight face detectors. More specifically, a student detector learns additional soft labels from a teacher detector by mimicking its classification map. To make the knowledge transfer more efficient, a threshold function is designed to assign threshold values adaptively for different objectness scores, so that only the informative samples are used for mimicking. Experiments on FDDB and WIDER FACE show that the proposed method consistently improves the performance of face detectors. With the proposed training method, we obtain a CPU real-time face detector that runs at 20 FPS while achieving state-of-the-art performance among CPU-based detectors.
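The abstract describes a mimic loss in which the student matches the teacher's classification map only at informative positions, selected by a threshold that adapts to the teacher's objectness scores. The exact threshold function is not given in the abstract, so the sketch below is only a hedged illustration: the rule `base_thr * (1 - teacher_score)` (tighter matching required where the teacher is more confident) and all function names are assumptions, not the authors' formulation.

```python
import numpy as np

def adaptive_mimic_loss(student_scores, teacher_scores, base_thr=0.2):
    """Sketch of a threshold-gated mimic loss over classification maps.

    Assumed rule (not from the paper): the higher the teacher's
    objectness score, the smaller the allowed deviation, so
    confident-face regions are always selected for mimicking.
    """
    student_scores = np.asarray(student_scores, dtype=np.float64)
    teacher_scores = np.asarray(teacher_scores, dtype=np.float64)

    # Per-position adaptive threshold derived from the teacher's scores.
    thresholds = base_thr * (1.0 - teacher_scores)

    # Keep only "informative" positions: where the student's score
    # deviates from the teacher's by more than the local threshold.
    mask = np.abs(student_scores - teacher_scores) > thresholds
    if not mask.any():
        return 0.0

    # L2 mimic loss on the selected positions only.
    diff = student_scores[mask] - teacher_scores[mask]
    return float(np.mean(diff ** 2))
```

In training, such a term would typically be added to the ordinary detection loss with a weighting factor, e.g. `total = det_loss + lam * adaptive_mimic_loss(s, t)`; the weighting scheme is likewise not specified in the abstract.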
Keywords
lightweight face detector, knowledge distillation, face detection, CPU-based devices, student detector, teacher detector, threshold function, CPU real-time face detector, CPU-based detectors, WIDER FACE, loss function, classification map, knowledge transfer