TinyBERT: Distilling BERT for Natural Language Understanding

EMNLP, pp.4163-4174, (2020)

Cited 95 | Viewed 184
EI

Abstract

Language model pre-training, such as BERT, has significantly improved the performance of many natural language processing tasks. However, pre-trained language models are usually computationally expensive and memory intensive, so it is difficult to execute them efficiently on resource-restricted devices. To accelerate inference and...


Authors
Jiao Xiaoqi
Yin Yichun
Chen Xiao
Li Linlin
Wang Fang