Transformer-based Context-aware Sarcasm Detection in Conversation Threads from Social Media
Meeting of the Association for Computational Linguistics, pp. 276-280, 2020.
We present a transformer-based sarcasm detection model that accounts for context from the entire conversation thread to make more robust predictions. Our model uses deep transformer layers to perform multi-head attention among the target utterance and the relevant context in the thread. The context-aware models are evaluated on two datasets…
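A minimal sketch of the idea behind context-aware attention (not the authors' implementation): the context tokens and the target-utterance tokens are concatenated into one sequence, so multi-head self-attention lets every target token attend to the thread context. The embeddings and projection weights below are random stand-ins for a pretrained transformer's, and the sequence lengths are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, n_heads = 16, 4
d_head = d_model // n_heads

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, Wq, Wk, Wv, Wo):
    # x: (seq_len, d_model); project, then split into heads
    seq_len = x.shape[0]
    q = (x @ Wq).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    k = (x @ Wk).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    v = (x @ Wv).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)  # (heads, seq, seq)
    attn = softmax(scores, axis=-1)
    out = (attn @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo

# Hypothetical thread: 5 context tokens followed by 3 target-utterance tokens.
context = rng.normal(size=(5, d_model))
target = rng.normal(size=(3, d_model))
x = np.concatenate([context, target])  # joint sequence seen by attention

Wq, Wk, Wv, Wo = (rng.normal(scale=0.1, size=(d_model, d_model)) for _ in range(4))
h = multi_head_attention(x, Wq, Wk, Wv, Wo)

# Pool the target-token representations for the sarcasm classifier head.
target_repr = h[-3:].mean(axis=0)
print(target_repr.shape)  # (16,)
```

In practice the same effect is obtained by feeding a pretrained transformer encoder a single input of the form `context [SEP] target`, so the attention layers mix the two segments exactly as above.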