TW-TGNN: Two Windows Graph-Based Model for Text Classification

2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)(2021)

Cited by 5 | Viewed 22
Abstract
Text classification is one of the most fundamental and classical tasks in natural language processing (NLP). Recently, graph neural network (GNN) methods, especially graph-based models, have been applied to this task because of their superior capacity for capturing global co-occurrence information. However, some existing GNN-based methods adopt a corpus-level graph structure, which incurs high memory consumption. In addition, these methods do not account for global co-occurrence information and local semantic information at the same time. To address these problems, we propose a new GNN-based model, the Two Windows Text GNN model (TW-TGNN), for text classification. Specifically, we build a text-level graph for each text with a local sliding window and a dynamic global window. On one hand, the local window sliding inside the text acquires sufficient local semantic features; on the other hand, the dynamic global window sliding between texts generates a dynamic shared weight matrix, which overcomes the limitation of fixed corpus-level co-occurrence statistics and provides richer dynamic global information. Our experimental results on four benchmark datasets demonstrate the improvement of the proposed method over state-of-the-art text classification methods. Moreover, we find that our method captures adequate global information for short texts, which helps overcome the lack of contextual information in short text classification.
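The paper itself does not include an implementation here, but the local-sliding-window step of building a text-level graph can be illustrated with a small sketch. The function below is a hypothetical example (the name `build_text_graph` and parameter `window_size` are assumptions, not from the paper): nodes are the unique tokens of one text, and edge weights count how often two tokens co-occur inside any window.

```python
from collections import defaultdict

def build_text_graph(tokens, window_size=3):
    """Build a text-level co-occurrence graph for a single text.

    Nodes are the unique tokens; an edge (i, j) is weighted by the number
    of sliding windows in which the two tokens co-occur.
    """
    nodes = sorted(set(tokens))
    index = {w: i for i, w in enumerate(nodes)}
    weights = defaultdict(int)
    # Slide a fixed-size window over the token sequence.
    for start in range(max(len(tokens) - window_size + 1, 1)):
        window = tokens[start:start + window_size]
        # Count every unordered token pair inside the window.
        for i in range(len(window)):
            for j in range(i + 1, len(window)):
                a, b = index[window[i]], index[window[j]]
                if a != b:
                    weights[(min(a, b), max(a, b))] += 1
    return nodes, dict(weights)
```

The dynamic global window described in the abstract would additionally slide between texts to update a shared weight matrix; that part depends on training details not given here, so only the per-text graph is sketched.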
Keywords
Text classification, Graph neural network, Representation learning