Word Level Domain-Diversity Attention Based LSTM Model for Sentiment Classification

2020 IEEE Fifth International Conference on Data Science in Cyberspace (DSC), 2020

Abstract
Sentiment classification is an important task in Natural Language Processing research and has considerable practical significance. The complexity of human sentiment implies that hidden information behind the text, such as the application scene or domain, may play an important role in predicting sentiment polarity. This paper presents a novel model for sentiment classification, the Domain-Diversity Attention Mechanism based LSTM Model (DDAM-LSTM), which integrates word-level domain-relevant features into an input-side attention mechanism of an LSTM network. First, we propose a method for representing and computing the domain-relevant features of each word according to its context. We then find that common words and certain domain-specific words show clearly different distributions of domain tendency. On this basis, an attention mechanism is designed to assign scaling weights to words at the input side of the LSTM network according to the diversity of their domain tendency. By combining this attention mechanism with the LSTM model, we fuse the implied domain knowledge into the neural network. Experimental results on three public benchmark datasets show that the proposed model yields clear performance improvements.
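To make the described input-side mechanism concrete, below is a minimal sketch (not the authors' code) of an LSTM sentiment classifier whose word embeddings are rescaled by a per-word domain-diversity weight before entering the recurrent layer. The class name, the shape of the precomputed domain-feature input, and the use of a learned linear projection to turn those features into a scalar weight are all assumptions for illustration; the paper derives its weights from each word's contextual domain tendency.

```python
# Hedged sketch of an input-side domain-diversity attention LSTM.
# Assumptions: domain features are precomputed per word and passed in as a
# tensor; the scalar attention weight is a sigmoid of a learned projection.
import torch
import torch.nn as nn

class DDAMLSTMSketch(nn.Module):
    def __init__(self, vocab_size, embed_dim, domain_dim, hidden_dim, num_classes):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Maps each word's domain-relevance feature vector to a scalar weight.
        self.domain_score = nn.Linear(domain_dim, 1)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids, domain_feats):
        # token_ids:    (batch, seq_len) word indices
        # domain_feats: (batch, seq_len, domain_dim) per-word domain features
        embeds = self.embedding(token_ids)                         # (B, T, E)
        weights = torch.sigmoid(self.domain_score(domain_feats))   # (B, T, 1)
        scaled = embeds * weights          # input-side attention: rescale words
        _, (h_n, _) = self.lstm(scaled)
        return self.classifier(h_n[-1])    # sentiment logits

# Usage with random tensors, just to show the expected shapes.
model = DDAMLSTMSketch(vocab_size=10000, embed_dim=100, domain_dim=8,
                       hidden_dim=128, num_classes=2)
tokens = torch.randint(0, 10000, (4, 20))
feats = torch.rand(4, 20, 8)
logits = model(tokens, feats)  # -> (4, 2)
```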
Keywords
deep learning, attention mechanism, sentiment analysis, natural language processing