SE-Prompt: Exploring Semantic Enhancement with Prompt Tuning for Relation Extraction

Cai Wang, Dongyang Li, Xiaofeng He

Advanced Data Mining and Applications: 19th International Conference, ADMA 2023, Shenyang, China, August 21–23, 2023, Proceedings, Part IV (2023)

Abstract
Compared to traditional supervised learning methods, applying prompt tuning to relation extraction is a challenging endeavor in practice. By inserting a template segment into the input, prompt tuning has proven effective for certain classification tasks. However, relation extraction requires mapping multiple words to a single label, which makes it difficult to precisely define a template and map labels to appropriate words. Prior approaches do not take full advantage of entities and also overlook the semantic connections between the words in a relation label. To address these limitations, we propose Semantic Enhancement with Prompt (SE-Prompt), which integrates entity and relation knowledge through two main contributions: semantic enhancement and subject-object relation refinement. These methods enable our model to effectively leverage relation labels and tap into the knowledge contained in pre-trained models. Experiments on three datasets, under both fully supervised and low-resource settings, demonstrate the effectiveness of our approach for relation extraction.
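Prompt tuning for relation extraction of the kind described above is typically implemented by appending a cloze-style template containing a [MASK] slot to the input sentence and scoring relation labels through a verbalizer that maps each label to a word. The following is a minimal sketch of that generic setup, assuming a masked-language-model backbone from Hugging Face Transformers; the model name, template, and verbalizer entries are illustrative assumptions and do not reflect the paper's actual SE-Prompt configuration.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed backbone for illustration; the paper's choice of model may differ.
MODEL_NAME = "roberta-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForMaskedLM.from_pretrained(MODEL_NAME)

# Hypothetical verbalizer: each relation label is mapped to a single label word.
verbalizer = {
    "org:founded_by": "founded",
    "per:employee_of": "employee",
    "no_relation": "unrelated",
}

def score_relations(sentence: str, subj: str, obj: str) -> dict:
    """Build a cloze prompt '<sentence> <subj> [MASK] <obj>.' and score each
    relation's label word at the [MASK] position."""
    prompt = f"{sentence} {subj} {tokenizer.mask_token} {obj}."
    inputs = tokenizer(prompt, return_tensors="pt")
    # Locate the [MASK] token in the encoded sequence.
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    scores = {}
    for label, word in verbalizer.items():
        # Leading space so RoBERTa's BPE produces the in-sentence word piece.
        token_id = tokenizer.convert_tokens_to_ids(tokenizer.tokenize(" " + word)[0])
        scores[label] = logits[token_id].item()
    return scores

print(score_relations("Bill Gates founded Microsoft.", "Bill Gates", "Microsoft"))
```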
Keywords
semantic enhancement, relation, SE-Prompt tuning