Bidirectional GRU with Multi-Head Attention for Chinese NER

Proceedings of the 2020 IEEE 5th Information Technology and Mechatronics Engineering Conference (ITOEC 2020)

Abstract
Named entity recognition (NER) is a fundamental task in natural language processing (NLP) whose purpose is to locate named entities in natural language text and classify them into predefined categories such as persons (PER), locations (LOC), and organizations (ORG). Deep learning methods are now widely used for this task and have achieved strong results. However, most NER research targets English rather than Chinese, and owing to the linguistic complexity of Chinese itself, current Chinese NER performance remains unsatisfactory. To address these problems, this paper proposes a new network structure, named BERT-BGRU-MHA-CRF. Experiments show that the model achieves an F1 value of 94.14% on the MSRA corpus, outperforming the current mainstream Bi-LSTM-CRF model and attaining good accuracy on the Chinese named entity recognition task.
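The abstract names the components of BERT-BGRU-MHA-CRF but gives no implementation details. The following is a minimal PyTorch sketch of that layer stack, assuming the Hugging Face transformers BertModel (bert-base-chinese) and the third-party pytorch-crf package; the hidden size, number of attention heads, and tag set are illustrative assumptions, not values reported in the paper.

    # Hedged sketch of a BERT -> BiGRU -> multi-head attention -> CRF tagger.
    import torch
    import torch.nn as nn
    from transformers import BertModel
    from torchcrf import CRF  # pytorch-crf package

    class BertBGRUMHACRF(nn.Module):
        def __init__(self, num_tags, gru_hidden=256, num_heads=8):
            super().__init__()
            # BERT produces contextual character embeddings for Chinese input.
            self.bert = BertModel.from_pretrained("bert-base-chinese")
            # Bidirectional GRU over the BERT outputs.
            self.bgru = nn.GRU(self.bert.config.hidden_size, gru_hidden,
                               batch_first=True, bidirectional=True)
            # Multi-head self-attention over the BiGRU states.
            self.mha = nn.MultiheadAttention(2 * gru_hidden, num_heads,
                                             batch_first=True)
            # Linear layer maps attention outputs to per-tag emission scores.
            self.emissions = nn.Linear(2 * gru_hidden, num_tags)
            # CRF models tag-transition constraints and decodes globally.
            self.crf = CRF(num_tags, batch_first=True)

        def forward(self, input_ids, attention_mask, tags=None):
            x = self.bert(input_ids,
                          attention_mask=attention_mask).last_hidden_state
            x, _ = self.bgru(x)
            # key_padding_mask is True at padding positions.
            x, _ = self.mha(x, x, x,
                            key_padding_mask=~attention_mask.bool())
            scores = self.emissions(x)
            mask = attention_mask.bool()
            if tags is not None:
                # Training: negative log-likelihood of the gold tag sequence.
                return -self.crf(scores, tags, mask=mask, reduction="mean")
            # Inference: Viterbi-decode the most likely tag sequence.
            return self.crf.decode(scores, mask=mask)

This sketch only illustrates how the named layers compose; training setup, tokenization, and hyperparameters would follow the paper itself.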
Keywords
Named Entity Recognition, Multi-Head Attention, BERT, Bidirectional GRU, CRF