Topic Modeling for Short Texts via Word Embedding and Document Correlation

IEEE Access (2020)

Abstract
Topic modeling is a widely studied, foundational problem in text mining. Conventional topic models based on word co-occurrences infer the hidden semantic structure from a corpus of documents. However, the limited length of short texts causes data sparsity, which impedes the inference process of conventional topic models and leads to unsatisfactory results. In fact, each short text usually covers only a small number of topics, and understanding its semantic content requires relevant background knowledge. Inspired by these observations, we propose a regularized non-negative matrix factorization topic model for short texts, named TRNMF. The proposed model leverages pre-trained distributional word representations to overcome the data sparsity problem of short texts. Meanwhile, it employs a clustering mechanism over document-to-topic distributions during topic inference, using the Gibbs Sampling Dirichlet Multinomial Mixture model. TRNMF successfully integrates both word co-occurrence regularization and sentence similarity regularization into topic modeling for short texts. Extensive experiments on real-world short text corpora show that TRNMF achieves better results than state-of-the-art methods in terms of topic coherence and text classification performance.
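As a rough illustration of the kind of objective the abstract describes, and not the paper's exact TRNMF formulation (which is defined in the full text), the sketch below factorizes a term-document matrix with two graph-style regularizers: one built from word similarities (e.g., pre-trained word embeddings) and one from sentence/document similarities. The function name and hyperparameters (trnmf_sketch, alpha, beta, lr) are hypothetical placeholders.

```python
import numpy as np

def trnmf_sketch(A, S_word, S_doc, k=10, alpha=0.1, beta=0.1,
                 lr=1e-3, iters=200, seed=0):
    """Illustrative graph-regularized NMF for short-text topic modeling.

    A      : (V, D) term-document matrix
    S_word : (V, V) word similarity matrix (e.g., from pre-trained embeddings)
    S_doc  : (D, D) document (sentence) similarity matrix
    Returns a topic-word factor W (V, k) and a document-topic factor H (k, D).
    Note: this is only an approximation in the spirit of the abstract, not the
    paper's exact TRNMF objective or update rules.
    """
    rng = np.random.default_rng(seed)
    V, D = A.shape
    W = rng.random((V, k))
    H = rng.random((k, D))

    # Graph Laplacians encoding word-level and document-level similarity.
    L_w = np.diag(S_word.sum(axis=1)) - S_word
    L_d = np.diag(S_doc.sum(axis=1)) - S_doc

    for _ in range(iters):
        R = W @ H - A                        # reconstruction residual
        grad_W = R @ H.T + alpha * (L_w @ W)  # word-similarity regularization
        grad_H = W.T @ R + beta * (H @ L_d)   # document-similarity regularization
        W = np.maximum(W - lr * grad_W, 0)    # projected gradient step (keep >= 0)
        H = np.maximum(H - lr * grad_H, 0)
    return W, H

# Example usage with random data (shapes only, for illustration):
# A = np.random.rand(500, 200); S_w = np.eye(500); S_d = np.eye(200)
# W, H = trnmf_sketch(A, S_w, S_d, k=10)
```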
Keywords
Topic model, short texts, word embedding, document correlation, non-negative matrix factorization, regularization