An Enhanced Gated Recurrent Unit with Auto-Encoder for Solving Text Classification Problems

Arabian Journal for Science and Engineering (2021)

Cited by 11
Abstract
Classification has become an important task for automatically categorizing documents into their respective groups. The purpose of classification is to assign a pre-specified group or class to an instance based on the observed features of that instance. For accurate text classification, feature selection techniques are normally used to identify important features and to remove irrelevant, undesired and noisy features, thereby minimizing the dimensionality of the feature space. Therefore, in this research, a new model named Encoder Simplified GRU (ES-GRU) is proposed, which reduces the dimensionality of the data using an auto-encoder (AE). The Gated Recurrent Unit (GRU) is a deep learning algorithm that contains an update gate and a reset gate and is considered one of the most efficient text classification techniques, particularly on sequential datasets. Accordingly, in the proposed model the reset gate is replaced with the update gate in order to reduce the redundancy and complexity of the standard GRU. The proposed model has been evaluated on five benchmark text datasets and compared with six well-known baseline text classification approaches, which include the standard GRU, AE, Long Short-Term Memory, Convolutional Neural Network, Support Vector Machine, and Naïve Bayes. Across various performance evaluation metrics, a considerable improvement has been observed in the performance of the proposed model compared to these state-of-the-art approaches.
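The abstract describes the simplification only at a high level: the reset gate is removed and the update gate's activation is reused where the reset gate would normally act. The following is a minimal NumPy sketch of such a cell, written as an illustration of that idea rather than the authors' implementation; the class and parameter names (SimplifiedGRUCell, Wz, Uh, etc.) are assumptions made here for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SimplifiedGRUCell:
    """One step of a simplified GRU in which the reset gate is dropped
    and the update gate's activation also gates the candidate state.
    Illustrative sketch only; names and initialization are assumptions."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # Update-gate weights (this single gate plays both roles here).
        self.Wz = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uz = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.bz = np.zeros(hidden_size)
        # Candidate-state weights.
        self.Wh = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uh = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.bh = np.zeros(hidden_size)

    def step(self, x, h_prev):
        # Update gate: z_t = sigmoid(Wz x_t + Uz h_{t-1} + bz).
        z = sigmoid(self.Wz @ x + self.Uz @ h_prev + self.bz)
        # Candidate state: z_t is reused where a standard GRU would apply
        # the separate reset gate r_t to h_{t-1}.
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (z * h_prev) + self.bh)
        # Blend previous state and candidate with the same update gate.
        return (1.0 - z) * h_prev + z * h_tilde

# Usage: run a (time, features) sequence through the cell, e.g. on
# document features already compressed by an auto-encoder.
cell = SimplifiedGRUCell(input_size=16, hidden_size=32)
h = np.zeros(32)
for x_t in np.random.default_rng(1).normal(size=(10, 16)):
    h = cell.step(x_t, h)
print(h.shape)  # (32,) final hidden state, usable as a document representation
```

Compared with a standard GRU, this variant keeps one weight set and one sigmoid per step instead of two, which is the redundancy and complexity reduction the abstract refers to.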
Keywords
RNN, LSTM, GRU, Auto-encoder, Text classification