Training Compact Models for Low Resource Entity Tagging Using Pre-trained Language Models
Fifth Workshop on Energy Efficient Machine Learning and Cognitive Computing - NeurIPS Edition (EMC2-NIPS 2019)
Keywords
nlp,transformers,language-modeling,bert,distillation,low-resource