Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling

EMNLP, 2018.


Abstract:

Many efforts have been made to facilitate natural language processing tasks with pre-trained language models (PTLMs), bringing significant improvements to various applications. To fully leverage the nearly unlimited corpora and capture linguistic information at multifarious levels, large LMs are required; but for a specific task, o...

