Joint architecture and knowledge distillation in CNN for Chinese text recognition

Pattern Recognition (2021)

Cited by 20 | Viewed 42
Abstract
• We propose a guideline to distill the architecture and knowledge of pre-trained standard CNNs simultaneously for fast compression and acceleration.
• The effectiveness is first verified on offline HCTR. Compared with the baseline CNN, the corresponding compact network can reduce the computational cost by >10× and the model size by >8× with negligible accuracy loss.
• Furthermore, the proposed method is successfully used to reduce the resource consumption of mainstream backbone networks on CTW and MNIST.
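The knowledge-distillation component referred to above can be illustrated with the generic soft-target loss of Hinton et al.; this is a minimal NumPy sketch under assumed names (`distillation_loss`, temperature `T`, mixing weight `alpha`), not the paper's exact joint architecture-and-knowledge scheme:

```python
import numpy as np

def softmax(z, T=1.0):
    """Row-wise softmax with temperature T (numerically stabilized)."""
    z = z / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of a softened KL term (teacher -> student) and
    ordinary cross-entropy on the hard labels. Illustrative only."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    # KL(teacher || student) on temperature-softened distributions,
    # rescaled by T^2 so gradient magnitudes match the hard-label term.
    soft = np.mean(np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=1)) * T * T
    hard = -np.mean(np.log(softmax(student_logits)[np.arange(len(labels)), labels]))
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: random logits for an 8-sample, 10-class batch.
rng = np.random.default_rng(0)
s = rng.normal(size=(8, 10))   # compact "student" logits
t = rng.normal(size=(8, 10))   # pre-trained "teacher" logits
y = rng.integers(0, 10, size=8)
loss = distillation_loss(s, t, y)
```

In this formulation the compact student network is trained against the pre-trained teacher's softened outputs in addition to the ground-truth labels, which is what lets a much smaller model retain most of the baseline's accuracy.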
Keywords
Convolutional neural network, Acceleration and compression, Architecture and knowledge distillation, Offline handwritten Chinese text recognition