
Transformer-based Models for Arabic Online Handwriting Recognition

International Journal of Advanced Computer Science and Applications (2022)

Abstract
Transformer neural networks have increasingly become the neural network design of choice, having recently been shown to outperform state-of-the-art end-to-end (E2E) recurrent neural networks (RNNs). Transformers use a self-attention mechanism to relate input frames and extract more expressive sequence representations. They also offer parallel computation and, unlike RNNs, can capture long-range dependencies in context. This work introduces a transformer-based model for the online handwriting recognition (OnHWR) task. As the transformer follows an encoder-decoder architecture, we investigated the self-attention encoder (SAE) with two different decoders: a self-attention decoder (SAD) and a connectionist temporal classification (CTC) decoder. The proposed models can recognize complete sentences without needing to integrate external language modules. We tested our proposed models on two Arabic online handwriting datasets: Online-KHATT and CHAW. In evaluation, the SAE-SAD architecture performed better than the SAE-CTC architecture. The SAE-SAD model achieved a 5% character error rate (CER) and an 18% word error rate (WER) on the CHAW dataset, and a 22% CER and a 56% WER on the Online-KHATT dataset. The SAE-SAD model showed significant improvements over existing models for Arabic OnHWR.
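The abstract describes two variants that share one encoder. The PyTorch code below is a rough sketch of how a self-attention encoder might be paired with a CTC head versus a self-attention decoder; it is not the paper's implementation, and all layer sizes, the learned positional embeddings, and the class names are illustrative assumptions.

import torch
import torch.nn as nn

class SAEncoder(nn.Module):
    # Self-attention encoder (SAE) over framed pen-trajectory features.
    def __init__(self, feat_dim, d_model=256, nhead=4, num_layers=4, max_len=2000):
        super().__init__()
        self.proj = nn.Linear(feat_dim, d_model)                # frame features -> model dim
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))  # learned positions (assumed)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
    def forward(self, x):                                       # x: (batch, frames, feat_dim)
        return self.encoder(self.proj(x) + self.pos[:, : x.size(1)])

class SAE_CTC(nn.Module):
    # SAE with a CTC decoder: per-frame class scores trained with CTC loss.
    def __init__(self, feat_dim, vocab_size, d_model=256):
        super().__init__()
        self.encoder = SAEncoder(feat_dim, d_model)
        self.head = nn.Linear(d_model, vocab_size + 1)          # +1 for the CTC blank symbol
    def forward(self, x):
        # nn.CTCLoss expects log-probabilities over (vocab + blank).
        return self.head(self.encoder(x)).log_softmax(-1)

class SAE_SAD(nn.Module):
    # SAE with a self-attention decoder: autoregressive character prediction.
    def __init__(self, feat_dim, vocab_size, d_model=256, nhead=4, num_layers=4, max_len=512):
        super().__init__()
        self.encoder = SAEncoder(feat_dim, d_model, nhead)
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))
        layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers)
        self.out = nn.Linear(d_model, vocab_size)
    def forward(self, x, tgt):                                  # tgt: (batch, chars) token ids
        memory = self.encoder(x)
        t = tgt.size(1)
        # Causal mask so each output character attends only to earlier ones.
        mask = torch.triu(torch.full((t, t), float("-inf"), device=tgt.device), diagonal=1)
        h = self.decoder(self.embed(tgt) + self.pos[:, :t], memory, tgt_mask=mask)
        return self.out(h)                                      # (batch, chars, vocab)

In this sketch, SAE-CTC emits per-frame log-probabilities suitable for torch.nn.CTCLoss, while SAE-SAD would be trained with cross-entropy on shifted target characters, letting the decoder act as an implicit character-level language model, consistent with the abstract's note that no external language module is needed.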
Keywords
Self-attention, Transformer, Deep learning, Connectionist temporal classification, Convolutional neural networks, Arabic online handwriting recognition