Topic Embedded Representation Enhanced Variational Wasserstein Autoencoder for Text Modeling

2022 IEEE 5th International Conference on Electronics Technology (ICET)

Abstract
Variational Autoencoders (VAEs) are now popular in text modeling and language generation tasks, which require attention to the diversity of generated results. Existing models are insufficient at capturing the built-in relationships between topic representations and sequential words. At the same time, there is a substantial mismatch between the commonly used simple Gaussian prior and the actual complex distribution of language texts. To address these problems, we introduce a hybrid Wasserstein Autoencoder (WAE) with Topic Embedded Representation (TER) for text modeling. TER is obtained through an embedding-based topic model and can capture the dependencies and semantic similarities between topics and words. With the help of TER, the learned latent variable carries rich semantic knowledge and is easier to interpret and control. Our experiments show that our method is competitive with other VAEs in text modeling.
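The abstract contrasts the WAE with the VAE's Gaussian-KL prior term. WAE variants commonly replace the KL divergence with a sample-based divergence between encoded latents and prior draws, such as a kernel Maximum Mean Discrepancy (MMD). The sketch below (in NumPy; not the authors' code, and the kernel choice and bandwidth are illustrative assumptions) shows how such an MMD penalty distinguishes latents that match the prior from latents that drift away from it:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    # Pairwise RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2).
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def mmd(X, Y, gamma=0.1):
    # Biased estimate of squared Maximum Mean Discrepancy between two
    # sample sets: here, encoded latents X versus prior draws Y.
    kxx = rbf_kernel(X, X, gamma)
    kyy = rbf_kernel(Y, Y, gamma)
    kxy = rbf_kernel(X, Y, gamma)
    return kxx.mean() + kyy.mean() - 2.0 * kxy.mean()

rng = np.random.default_rng(0)
prior = rng.standard_normal((256, 8))          # draws from the N(0, I) prior
matched = rng.standard_normal((256, 8))        # latents already prior-like
shifted = rng.standard_normal((256, 8)) + 2.0  # latents far from the prior

print(mmd(matched, prior))  # close to zero: little penalty
print(mmd(shifted, prior))  # substantially larger: strong penalty
```

In WAE training this penalty is added to the reconstruction loss in place of the VAE's KL term, which is one way the simple-Gaussian-prior mismatch mentioned above can be relaxed.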
Keywords
variational autoencoders,text modeling,topic embedded representation,Wasserstein autoencoder,topic model