
Tagging and chunking with bigrams

COLING '00: Proceedings of the 18th Conference on Computational Linguistics - Volume 2 (2000)

Cited by 28
Abstract
In this paper we present an integrated system for tagging and chunking texts from a given language. The approach is based on stochastic finite-state models that are learnt automatically: bigram models or finite-state automata learnt using grammatical inference techniques. Because all the models involved are learnt automatically, the system is very flexible and portable. To show the viability of our approach, we present results for tagging and chunking using bigram models on the Wall Street Journal corpus. We achieve a tagging accuracy of 96.8%, and a precision of 94.6% with a recall of 93.6% for NP chunks.
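As an illustration of the kind of stochastic bigram model the abstract describes, the sketch below trains a first-order (bigram) HMM tagger from tagged sentences and decodes with Viterbi. The data format, function names, and the `unk_logprob` fallback for unseen events are assumptions made for this example, not the authors' implementation; chunking could be handled analogously over chunk labels, though the paper's exact formulation may differ.

```python
# Minimal bigram-HMM tagger sketch (illustrative; not the paper's code).
import math
from collections import defaultdict

START, STOP = "<s>", "</s>"

def train_bigram_hmm(tagged_sentences):
    """tagged_sentences: list of sentences, each a list of (word, tag) pairs."""
    trans_counts = defaultdict(lambda: defaultdict(int))  # count(prev_tag -> tag)
    emit_counts = defaultdict(lambda: defaultdict(int))   # count(tag -> word)
    for sent in tagged_sentences:
        prev = START
        for word, tag in sent:
            trans_counts[prev][tag] += 1
            emit_counts[tag][word] += 1
            prev = tag
        trans_counts[prev][STOP] += 1

    def log_probs(counts):
        table = {}
        for ctx, row in counts.items():
            total = sum(row.values())
            table[ctx] = {x: math.log(c / total) for x, c in row.items()}
        return table

    return log_probs(trans_counts), log_probs(emit_counts)

def viterbi(words, trans, emit, unk_logprob=-20.0):
    """Return the most probable tag sequence under the bigram model.
    unk_logprob is a crude fallback for unseen words/transitions (assumption)."""
    tags = list(emit.keys())
    best = [{} for _ in words]   # best[i][t]: best log-prob ending in tag t at position i
    back = [{} for _ in words]   # back[i][t]: previous tag on that best path
    for t in tags:
        best[0][t] = (trans.get(START, {}).get(t, unk_logprob)
                      + emit[t].get(words[0], unk_logprob))
        back[0][t] = None
    for i in range(1, len(words)):
        for t in tags:
            e = emit[t].get(words[i], unk_logprob)
            score, prev = max(
                (best[i - 1][p] + trans.get(p, {}).get(t, unk_logprob) + e, p)
                for p in tags)
            best[i][t], back[i][t] = score, prev
    # Pick the best final tag (including the transition into STOP) and trace back.
    last = max(tags, key=lambda t: best[-1][t] + trans.get(t, {}).get(STOP, unk_logprob))
    path = [last]
    for i in range(len(words) - 1, 0, -1):
        path.append(back[i][path[-1]])
    return list(reversed(path))
```

For example, after `trans, emit = train_bigram_hmm(training_data)`, calling `viterbi(["The", "dog", "barks"], trans, emit)` returns the highest-probability tag sequence under the learnt bigram model.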
Keywords
accuracy rate,np chunk,portable system,recall rate,present result,integrated system,finite-state automata learnt,precision rate,chunking text,stochastic finite-state model,finite state automata,integrable system,grammatical inference