Distilling GRU with Data Augmentation for Unconstrained Handwritten Text Recognition

2018 16th International Conference on Frontiers in Handwriting Recognition (ICFHR), 2018

Cited by 7 | Views 49
Abstract
Handwritten texts in various styles, such as horizontal, overlapping, vertical, and multi-line texts, are commonly observed in practice. However, most existing handwriting recognition methods concentrate on only one specific text style. In this paper, we address the problem of unconstrained handwritten text recognition and propose a distilling gated recurrent unit (GRU) combined with a new data augmentation technique to model the complex sequential dynamics of unconstrained handwritten text in various styles. The proposed data augmentation method can synthesize realistic handwritten text datasets covering the horizontal, vertical, overlapping, right-down, screw-rotation, and multi-line cases, which makes our framework robust for general use. The proposed distilling GRU not only accelerates training through the distillation stage but also maintains the original recognition accuracy. Experiments on our synthesized handwritten test sets show that the proposed multi-layer GRU performs well on the unconstrained handwritten text recognition problem. On the ICDAR 2013 handwritten text recognition benchmark, the proposed framework achieves performance comparable to state-of-the-art methods.
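The distillation stage mentioned in the abstract can be illustrated with a minimal sketch. The abstract does not give the exact loss, so the example below assumes a standard Hinton-style soft-target objective: the student's per-timestep output distribution is matched to the teacher's temperature-softened distribution. All function names and the temperature value are illustrative, not taken from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Cross-entropy between the teacher's tempered soft targets and the
    student's tempered predictions, averaged over timesteps.
    (Hypothetical formulation; the paper's exact loss is not specified.)"""
    soft_targets = softmax(teacher_logits, T)
    log_student = np.log(softmax(student_logits, T))
    return float(-(soft_targets * log_student).sum(axis=-1).mean())
```

A student GRU that matches the teacher's logits incurs only the teacher's own entropy, while a mismatched student incurs a strictly larger loss, which is what drives the student toward the teacher's behavior during the distillation stage.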
Keywords
unconstrained, text recognition, data augmentation, RNN