Knowledge Enhanced Contextual Word Representations

Mark Neumann
Robert Logan
Vidur Joshi

EMNLP/IJCNLP (1), pp. 43-54, 2019.

Cited by: 131

Abstract:

Contextual word representations, typically trained on unstructured, unlabeled text, do not contain any explicit grounding to real world entities and are often unable to remember facts about those entities. We propose a general method to embed multiple knowledge bases (KBs) into large scale models, and thereby enhance their representations with structured, human-curated knowledge. For each KB, we first use an integrated entity linker to retrieve relevant entity embeddings, then update contextual word representations via a form of word-to-entity attention. In contrast to previous approaches, the entity linkers and self-supervised language modeling objective are jointly trained end-to-end in a multitask setting that combines a small amount of entity linking supervision with a large amount of raw text. After integrating WordNet and a subset of Wikipedia into BERT, the knowledge enhanced BERT (KnowBert) demonstrates improved perplexity, ability to recall facts as measured in a probing task, and downstream performance on relationship extraction, entity typing, and word sense disambiguation. KnowBert's runtime is comparable to BERT's and it scales to large KBs.
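The word-to-entity attention described in the abstract can be illustrated with a minimal sketch. This is an illustrative toy, not the paper's implementation: the function names, shapes, and the simple residual update are assumptions; the actual model uses learned projections and an integrated entity linker.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def word_to_entity_attention(word_reprs, entity_embs):
    """Toy word-to-entity attention (illustrative only).

    Each word representation attends over candidate entity embeddings,
    and the attention-weighted entity vector is added back to the word
    representation, "grounding" it in KB knowledge.

    word_reprs:  (n_words, d) contextual word representations
    entity_embs: (n_entities, d) entity embeddings from a KB
    """
    scores = word_reprs @ entity_embs.T        # (n_words, n_entities)
    weights = softmax(scores, axis=-1)         # attention over entities
    entity_context = weights @ entity_embs     # (n_words, d)
    return word_reprs + entity_context         # residual knowledge update
```

In the paper, this kind of update is inserted between transformer layers, so later layers see the knowledge-enhanced representations.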
