Enhancing Unsupervised Pretraining with External Knowledge for Natural Language Inference

Advances in Artificial Intelligence (2019)

Abstract
Unsupervised pretraining such as BERT (Bidirectional Encoder Representations from Transformers) [2] represents one of the most recent advances in learning representations for natural language, and has helped achieve leading performance on many natural language processing problems. Although BERT can leverage large corpora, we hypothesize that it cannot learn all the semantics and knowledge needed for natural language inference (NLI). In this paper, we leverage human-authored external knowledge to further improve BERT, and our results show that BERT, the current state-of-the-art pretraining framework, can benefit from external knowledge.
Keywords
BERT, Natural Language Inference, External Knowledge