Knowledge Base Completion by Variational Bayesian Neural Tensor Decomposition

Cognitive Computation (2018)

Abstract
Knowledge base completion is an important research problem for knowledge bases, which play important roles in question answering, information retrieval, and other applications. A number of relational learning algorithms have been proposed to solve this problem. However, despite their success in modeling entity relations, they are not well founded in a Bayesian manner and thus struggle to model prior information about the entity and relation factors. Furthermore, they under-represent the interaction between entity and relation factors. To avoid these disadvantages, we propose a neural-inspired approach, namely the Bayesian Neural Tensor Decomposition approach, for knowledge base completion based on the Stochastic Gradient Variational Bayes (SGVB) framework. We employ a multivariate Bernoulli likelihood function to represent the existence of facts in knowledge graphs, and a multi-layered perceptron to represent more complex interactions between the latent subject, predicate, and object factors. The SGVB framework enables efficient approximate variational inference for the proposed nonlinear probabilistic tensor decomposition through a novel local reparameterization trick. This avoids the need for expensive iterative inference schemes such as MCMC and, in contrast to common variational inference, does not make over-simplified assumptions about the posterior distributions. To evaluate the proposed model, we have conducted experiments on real-world knowledge bases, namely FreeBase and WordNet. Experimental results indicate the promising performance of the proposed method.
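The abstract's scoring scheme can be sketched as follows. This is a minimal illustrative reconstruction, not the paper's implementation: the dimensions, network shape, and all parameter values (`mu`, `log_sigma`, `W1`, `b1`, `w2`) are assumed placeholders. Latent subject, predicate, and object factors are sampled via the reparameterization trick and fed through a small perceptron whose sigmoid output serves as the Bernoulli parameter for "this fact exists".

```python
import numpy as np

rng = np.random.default_rng(0)
d, h = 8, 16  # latent and hidden dimensions (illustrative choices)

# Variational parameters of the subject/predicate/object factor posteriors.
# In the paper these are learned; here they are random placeholders.
mu = {k: rng.normal(size=d) for k in ("s", "p", "o")}
log_sigma = {k: rng.normal(scale=0.1, size=d) for k in ("s", "p", "o")}

def reparameterize(mean, log_std, rng):
    """Sample z = mu + sigma * eps with eps ~ N(0, I): the reparameterization trick."""
    eps = rng.normal(size=mean.shape)
    return mean + np.exp(log_std) * eps

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Perceptron weights scoring the concatenated factors (placeholders for learned values).
W1 = rng.normal(scale=0.1, size=(h, 3 * d))
b1 = np.zeros(h)
w2 = rng.normal(scale=0.1, size=h)

def fact_probability(rng):
    """Bernoulli parameter for the existence of one (subject, predicate, object) fact."""
    z = np.concatenate([reparameterize(mu[k], log_sigma[k], rng) for k in ("s", "p", "o")])
    hidden = np.tanh(W1 @ z + b1)
    return sigmoid(w2 @ hidden)

p = fact_probability(rng)  # a probability in (0, 1)
```

Training would maximize the variational lower bound over many such sampled scores; the sigmoid output plugs directly into the multivariate Bernoulli likelihood described above.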
Keywords
Knowledge base completion, Variational Bayesian, Neural networks