Semi-supervised sequence tagging with bidirectional language models

Annual Meeting of the Association for Computational Linguistics (ACL), 2017.

Cited by 299 | Views: 123 | DOI: https://doi.org/10.18653/v1/P17-1161
Other links: dblp.uni-trier.de | academic.microsoft.com | arxiv.org

Abstract

Pre-trained word embeddings learned from unlabeled text have become a standard component of neural network architectures for NLP tasks. However, in most cases, the recurrent network that operates on word-level representations to produce context sensitive representations is trained on relatively little labeled data. In this paper, we demonstrate a general semi-supervised approach for adding pre-trained context embeddings from bidirectional language models to NLP systems and apply it to sequence labeling tasks. We evaluate our model on two standard datasets for named entity recognition (NER) and chunking, and in both cases achieve new state of the art results, surpassing previous systems that use other forms of transfer or joint learning with additional labeled data and task specific gazetteers.
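
To make the abstract's core idea concrete, here is a minimal PyTorch sketch of a TagLM-style tagger: contextual embeddings from a frozen, pre-trained bidirectional language model are concatenated with the tagger's first-layer representations before the second tagging layer. All class names, dimensions, and the toy vocabulary are illustrative assumptions, not the authors' implementation, and the softmax projection stands in for the CRF output layer used in the paper.

```python
# Sketch of semi-supervised sequence tagging with a bidirectional LM (TagLM-style).
# Hypothetical names and dimensions; not the authors' code.
import torch
import torch.nn as nn

class BiLMEmbedder(nn.Module):
    """Stand-in for a pre-trained bidirectional LM: returns concatenated
    forward and backward hidden states per token, frozen at tagging time."""
    def __init__(self, vocab_size, emb_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.fwd = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.bwd = nn.LSTM(emb_dim, hidden_dim, batch_first=True)

    @torch.no_grad()  # LM parameters are not updated while training the tagger
    def forward(self, token_ids):
        x = self.embed(token_ids)
        h_fwd, _ = self.fwd(x)                        # left-to-right context
        h_bwd, _ = self.bwd(torch.flip(x, dims=[1]))  # right-to-left context
        h_bwd = torch.flip(h_bwd, dims=[1])           # realign to token order
        return torch.cat([h_fwd, h_bwd], dim=-1)      # (batch, seq, 2*hidden)

class TagLMSketch(nn.Module):
    """Two-layer bi-LSTM tagger; the frozen LM embeddings are concatenated
    with the first layer's output before the second layer."""
    def __init__(self, vocab_size, num_tags, emb_dim=64, hidden_dim=128, lm_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn1 = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.rnn2 = nn.LSTM(2 * hidden_dim + lm_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * hidden_dim, num_tags)  # paper uses a CRF here

    def forward(self, token_ids, lm_embeddings):
        h, _ = self.rnn1(self.embed(token_ids))
        h = torch.cat([h, lm_embeddings], dim=-1)  # inject LM context embeddings
        h, _ = self.rnn2(h)
        return self.proj(h)                        # per-token tag scores

# Toy usage: batch of 2 sentences, 5 tokens each, 10-tag scheme.
vocab_size, num_tags = 1000, 10
lm = BiLMEmbedder(vocab_size)
tagger = TagLMSketch(vocab_size, num_tags)
ids = torch.randint(0, vocab_size, (2, 5))
scores = tagger(ids, lm(ids))
print(scores.shape)  # torch.Size([2, 5, 10])
```

In the paper the bidirectional LM is trained on large unlabeled corpora and then held fixed, so the only supervised parameters are those of the tagger itself; the sketch mirrors that split via the `@torch.no_grad()` decorator on the LM's forward pass.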
