
In-context Contrastive Learning for Event Causality Identification

EMNLP 2024

Abstract
Event Causality Identification (ECI) aims at determining the existence of a causal relation between two events. Although recent prompt learning-based approaches have shown promising improvements on the ECI task, their performance is often subject to the delicate design of multiple prompts and the positive correlations between the main task and derivative tasks. The in-context learning paradigm provides explicit guidance for label prediction in the prompt learning paradigm, alleviating its reliance on complex prompts and derivative tasks. However, it does not distinguish between positive and negative demonstrations for analogy learning. Motivated by such considerations, this paper proposes an In-Context Contrastive Learning (ICCL) model that utilizes contrastive learning to enhance the effectiveness of both positive and negative demonstrations. Additionally, we apply contrastive learning to event pairs to better facilitate event causality identification. ICCL is evaluated on widely used corpora, including EventStoryLine and Causal-TimeBank, and the results show significant performance improvements over state-of-the-art algorithms.

[Key points]: This paper proposes a model named In-Context Contrastive Learning (ICCL), which uses contrastive learning to enhance the effectiveness of positive and negative demonstrations, and additionally applies contrastive learning to event pairs to improve event causality identification.

[Method]: The ICCL model uses contrastive learning to provide explicit in-context guidance for label prediction, distinguishing between positive and negative demonstrations for analogy learning.

[Experiments]: The ICCL model was evaluated on the widely used EventStoryLine and Causal-TimeBank corpora, and the results show that its performance significantly surpasses state-of-the-art algorithms.
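The contrastive objective described above can be sketched with a standard InfoNCE-style loss: the representation of the query event pair is pulled toward positive demonstrations and pushed away from negative ones. This is a minimal illustrative sketch, not the paper's actual implementation; the function name, temperature value, and use of cosine similarity are assumptions.

```python
import numpy as np

def contrastive_demo_loss(anchor, positives, negatives, tau=0.1):
    """InfoNCE-style contrastive loss (illustrative sketch).

    anchor:     vector for the query event pair
    positives:  vectors of positive demonstrations (same label)
    negatives:  vectors of negative demonstrations (opposite label)
    tau:        temperature (hypothetical value)
    """
    def cos_sim(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # exponentiated, temperature-scaled similarities
    pos = [np.exp(cos_sim(anchor, p) / tau) for p in positives]
    neg = [np.exp(cos_sim(anchor, n) / tau) for n in negatives]
    denom = sum(pos) + sum(neg)

    # average negative log-likelihood over the positive demonstrations
    return -float(np.mean([np.log(p / denom) for p in pos]))
```

An anchor that is close to the positive demonstrations and far from the negatives yields a small loss, while the reverse arrangement yields a large one, which is the behavior the contrastive term exploits.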