Multilingual Model Fine-tuning for Sentiment Analysis

Mohamed L. Elrefai, Mahmoud I. Khalil, Hazem M. Abbas

2023 Eleventh International Conference on Intelligent Computing and Information Systems (ICICIS)

Abstract
Multilingual language models have lowered the barrier between languages and can help with many tasks, such as sentiment analysis, which matters for making good decisions and customizing products. Information learned in one language can help a model generalize and understand a task more effectively in other languages. In this paper, we propose a general method for sentiment analysis of data that spans many languages, enabling applications to use sentiment analysis results in a language-blind or language-independent manner. We performed experiments on two languages (English and Arabic) for sentence-level sentiment classification and found that the best setup added translations from one language to the other and fine-tuned a multilingual language model for Twitter, achieving F1-scores of 71.2% for English and 68.1% for Arabic.
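As a rough illustration of the setup described above (not the authors' exact pipeline), the following minimal sketch fine-tunes a multilingual Twitter language model on a mixed English/Arabic sentiment corpus augmented with translations. The checkpoint name cardiffnlp/twitter-xlm-roberta-base, the toy examples, and the label scheme are assumptions for illustration only.

```python
# Hypothetical sketch: fine-tune a multilingual Twitter LM on mixed English/Arabic
# sentiment data augmented with translations. Model name, data, and label scheme
# are assumptions, not taken from the paper.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL = "cardiffnlp/twitter-xlm-roberta-base"  # assumed multilingual Twitter LM

# Toy mixed-language corpus; each sentence is also paired with its translation
# into the other language (the augmentation step described in the abstract).
examples = [
    {"text": "I love this phone", "label": 2},
    {"text": "أكره هذا المنتج", "label": 0},       # "I hate this product"
    {"text": "أحب هذا الهاتف", "label": 2},        # translation of the first row
    {"text": "I hate this product", "label": 0},   # translation of the second row
]
dataset = Dataset.from_list(examples)

tokenizer = AutoTokenizer.from_pretrained(MODEL)

def tokenize(batch):
    # Tokenize both English and Arabic text with the shared multilingual vocabulary.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=64)

dataset = dataset.map(tokenize, batched=True)

# Three sentiment classes assumed (negative / neutral / positive).
model = AutoModelForSequenceClassification.from_pretrained(MODEL, num_labels=3)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="xlmt-sentiment",
                           num_train_epochs=3,
                           per_device_train_batch_size=8),
    train_dataset=dataset,
)
trainer.train()
```

Because the tokenizer and encoder are shared across languages, the fine-tuned classifier can then score English and Arabic sentences through the same forward pass, which is what makes the language-independent evaluation in the paper possible.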
Keywords
Multilingual Language Model, Sentiment Classification, Fine-Tuning