
Wiki-based Prompts for Enhancing Relation Extraction Using Language Models

39th Annual ACM Symposium on Applied Computing, SAC 2024 (2024)

Abstract
Prompt-tuning and instruction-tuning of language models have exhibited significant results in few-shot Natural Language Processing (NLP) tasks, such as Relation Extraction (RE), which involves identifying relationships between entities within a sentence. However, the effectiveness of these methods relies heavily on the design of the prompts. A compelling question is whether incorporating external knowledge can enhance the language model's understanding of NLP tasks. In this paper, we introduce wiki-based prompt construction that leverages Wikidata as a source of information to craft more informative prompts for both prompt-tuning and instruction-tuning of language models in RE. Our experiments show that using wiki-based prompts enhances cutting-edge language models in RE, emphasizing their potential for improving RE tasks. Our code and datasets are available at GitHub.
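To make the idea of wiki-based prompt construction concrete, the sketch below shows one way an RE prompt could be enriched with Wikidata entity descriptions. It is a minimal illustration under assumptions: the prompt template, the helper names, and the use of the public `wbsearchentities` endpoint are not taken from the paper's implementation (available in the authors' repository).

```python
# Minimal sketch: augmenting a relation-extraction prompt with Wikidata
# entity descriptions. Illustrative only; the template and endpoint choice
# are assumptions, not the authors' implementation.
import requests

WIKIDATA_API = "https://www.wikidata.org/w/api.php"


def wikidata_description(entity: str, lang: str = "en") -> str:
    """Return the short Wikidata description of the best-matching entity,
    or an empty string if no match is found."""
    params = {
        "action": "wbsearchentities",
        "search": entity,
        "language": lang,
        "format": "json",
        "limit": 1,
    }
    resp = requests.get(WIKIDATA_API, params=params, timeout=10)
    resp.raise_for_status()
    hits = resp.json().get("search", [])
    return hits[0].get("description", "") if hits else ""


def build_wiki_prompt(sentence: str, head: str, tail: str) -> str:
    """Compose an RE prompt that injects Wikidata descriptions of the
    head and tail entities (hypothetical template)."""
    head_desc = wikidata_description(head)
    tail_desc = wikidata_description(tail)
    return (
        f"Sentence: {sentence}\n"
        f"Entity 1: {head} ({head_desc})\n"
        f"Entity 2: {tail} ({tail_desc})\n"
        f"Question: What is the relation between Entity 1 and Entity 2?"
    )


if __name__ == "__main__":
    print(build_wiki_prompt(
        "Marie Curie was born in Warsaw.", "Marie Curie", "Warsaw"))
```

The resulting prompt carries short external descriptions of both entities, which is the kind of added context the paper argues can help both prompt-tuned and instruction-tuned models decide on the relation.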
Keywords
Relation Extraction, Language Models, Prompt Construction, Knowledge Integration