Syntax-Directed Hybrid Attention Network for Aspect-Level Sentiment Analysis.
IEEE Access (2019)
Abstract
Aspect-level sentiment analysis is a fine-grained task in sentiment analysis that aims at detecting the sentiment polarity towards a specific target in a sentence. Previous studies focus on a global attention mechanism that attends to all words in the context to model the interaction between the target and the sentence. However, global attention tends to assign high attention scores to irrelevant sentiment words when the sentence contains noisy words or multiple targets. To address this problem, we propose a novel syntax-directed hybrid attention network (SHAN). In SHAN, a global attention is employed to capture coarse information about the target, and a syntax-directed local attention is used to focus on words syntactically close to the target. An information gate is then utilized to synthesize the information from the local and global attention results and adaptively generate a less noisy and more sentiment-oriented representation. The experimental results on the SemEval 2014 datasets demonstrate the effectiveness of the proposed method.
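The abstract outlines a hybrid of global attention, syntax-directed local attention, and an information gate. The following is a minimal PyTorch sketch of that idea, assuming precomputed context states, a target vector, and a syntax mask marking words within a chosen dependency distance of the target; the layer sizes, names, and the sigmoid-gate form are illustrative assumptions, not the authors' exact architecture.

# Hedged sketch: global attention over all context words, local attention
# restricted to syntactically close words, and a gate blending the two.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HybridAttention(nn.Module):
    def __init__(self, hidden_dim):
        super().__init__()
        self.global_score = nn.Linear(hidden_dim * 2, 1)   # scores each word against the target
        self.local_score = nn.Linear(hidden_dim * 2, 1)
        self.gate = nn.Linear(hidden_dim * 2, hidden_dim)  # information gate (assumed form)

    def attend(self, scorer, H, t, mask):
        # H: (batch, seq, hidden) context states; t: (batch, hidden) target vector
        t_exp = t.unsqueeze(1).expand(-1, H.size(1), -1)
        scores = scorer(torch.cat([H, t_exp], dim=-1)).squeeze(-1)
        scores = scores.masked_fill(~mask, float('-inf'))   # exclude masked-out words
        alpha = F.softmax(scores, dim=-1)
        return torch.bmm(alpha.unsqueeze(1), H).squeeze(1)  # attention-weighted summary

    def forward(self, H, t, pad_mask, syntax_mask):
        # pad_mask marks real tokens; syntax_mask marks tokens syntactically
        # close to the target (assumed to be precomputed from a dependency parse).
        g = self.attend(self.global_score, H, t, pad_mask)      # coarse, global view
        l = self.attend(self.local_score, H, t, syntax_mask)    # syntax-directed local view
        z = torch.sigmoid(self.gate(torch.cat([g, l], dim=-1))) # gate values in [0, 1]
        return z * l + (1 - z) * g                              # blended target representation

# Toy usage with random states and a hand-made syntax mask.
H = torch.randn(2, 5, 64)
t = torch.randn(2, 64)
pad = torch.ones(2, 5, dtype=torch.bool)
syn = torch.tensor([[1, 1, 0, 0, 0], [0, 1, 1, 1, 0]], dtype=torch.bool)
out = HybridAttention(64)(H, t, pad, syn)
print(out.shape)  # torch.Size([2, 64])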
Key words
Aspect-level sentiment analysis, hybrid attention, syntactic information, gating mechanism