Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling

Abstract

Many efforts have been made to facilitate natural language processing tasks with pre-trained language models (PTLMs), bringing significant improvements to various applications. To fully leverage the nearly unlimited corpora and capture linguistic information at multiple levels, large LMs are required; but for a specific task, only part of this information is useful. Such large models, even at the inference stage, incur heavy computational workloads, making them too time-consuming for real-world applications. ...
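
The abstract is truncated before the method is described, but the title indicates that the paper prunes a pre-trained LM to cut inference cost for sequence labeling. As a rough, hedged illustration of why layer pruning saves computation, here is a minimal PyTorch sketch in which only the lower layers of a multi-layer LSTM language model are run to produce contextualized representations for a downstream tagger. The `PrunableLM` class, its dimensions, and the `keep_layers` argument are hypothetical and not the authors' implementation.

```python
import torch
import torch.nn as nn

# Hypothetical sketch (not the paper's exact method): a multi-layer LSTM
# language model whose upper layers can be skipped at inference time,
# reducing computation for a downstream sequence labeler.

class PrunableLM(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=64, hidden_dim=64, num_layers=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # One LSTM module per layer so individual layers can be dropped.
        self.layers = nn.ModuleList(
            nn.LSTM(emb_dim if i == 0 else hidden_dim, hidden_dim,
                    batch_first=True)
            for i in range(num_layers)
        )

    def forward(self, tokens, keep_layers=None):
        # keep_layers: how many lower layers to run; None runs all of them.
        h = self.embed(tokens)
        n = keep_layers if keep_layers is not None else len(self.layers)
        for layer in self.layers[:n]:
            h, _ = layer(h)
        return h  # contextualized representations for the tagger

lm = PrunableLM()
tokens = torch.randint(0, 1000, (2, 10))  # batch of 2 sentences, length 10
full = lm(tokens)                          # all 4 layers
pruned = lm(tokens, keep_layers=2)         # cheaper: only 2 lower layers kept
print(full.shape, pruned.shape)            # both torch.Size([2, 10, 64])
```

In this toy setup, dropping the two upper layers roughly halves the recurrent computation while still producing representations of the same shape, which is the kind of inference-time saving the abstract motivates.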

EMNLP, 2018.
