Knowledge Enhanced Contextual Word Representations
EMNLP/IJCNLP (1), pp. 43-54, 2019.
Contextual word representations, typically trained on unstructured, unlabeled text, do not contain any explicit grounding to real world entities and are often unable to remember facts about those entities. We propose a general method to embed multiple knowledge bases (KBs) into large scale models, and thereby enhance their representations…