Scalable Probabilistic Entity-Topic Modeling.
CoRR (2013)
Abstract
We present an LDA approach to entity disambiguation. Each topic is associated with a Wikipedia article and topics generate either content words or entity mentions. Training such models is challenging because of the topic and vocabulary size, both in the millions. We tackle these problems using a novel distributed inference and representation framework based on a parallel Gibbs sampler guided by the Wikipedia link graph, and pipelines of MapReduce allowing fast and memory-frugal processing of large datasets. We report state-of-the-art performance on a public dataset.
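The abstract describes training an LDA-style model with a parallel, distributed Gibbs sampler. As a point of reference for the inference machinery involved, here is a minimal serial collapsed Gibbs sampler for plain LDA; it is a sketch only, not the paper's distributed, Wikipedia-link-guided sampler, and all names and parameters are illustrative.

```python
import random

def lda_gibbs(docs, n_topics, vocab_size, alpha=0.1, beta=0.01, iters=100, seed=0):
    """Collapsed Gibbs sampling for plain LDA (serial, single-machine sketch).

    docs: list of documents, each a list of word ids in [0, vocab_size).
    Returns topic assignments z plus doc-topic and topic-word count matrices.
    """
    rng = random.Random(seed)
    # z[d][i]: topic currently assigned to the i-th token of document d
    z = [[rng.randrange(n_topics) for _ in doc] for doc in docs]
    ndk = [[0] * n_topics for _ in docs]                  # doc-topic counts
    nkw = [[0] * vocab_size for _ in range(n_topics)]     # topic-word counts
    nk = [0] * n_topics                                   # tokens per topic
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                # remove current assignment, then resample from the
                # collapsed full conditional p(z = t | rest), up to a constant
                k = z[d][i]
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                weights = [
                    (ndk[d][t] + alpha)
                    * (nkw[t][w] + beta) / (nk[t] + vocab_size * beta)
                    for t in range(n_topics)
                ]
                r = rng.random() * sum(weights)
                acc = 0.0
                for t, wt in enumerate(weights):
                    acc += wt
                    if r <= acc:
                        k = t
                        break
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return z, ndk, nkw
```

At the scale the paper targets (millions of topics and vocabulary items) the dense count matrices above are infeasible, which is what motivates the distributed representation and MapReduce pipelines the abstract mentions.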