Representation of Relations by Planes in Neural Network Language Model

Lecture Notes in Computer Science (2016)

Abstract
Whole brain architecture (WBA), which uses neural networks to imitate the human brain, is attracting increased attention as a promising way to achieve artificial general intelligence, and distributed vector representations of words are becoming recognized as the best way to connect neural networks with knowledge. Distributed representations of words have played a wide range of roles in natural language processing, and they have become increasingly important because of their ability to capture a large amount of syntactic and lexical meaning and many relationships. Relation vectors are used to represent relations between words, but this approach has problems: some relations, such as sibling relations, parent-child relations, and many-to-one relations, cannot be easily defined. To deal with these problems, we have created a novel way of representing relations: we represent relations by planes instead of by vectors, which increases the accuracy of relation prediction by more than 10%.
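The abstract does not give implementation details, but the core idea can be illustrated with a minimal sketch. Assuming a relation is modeled by fitting a plane (a unit normal vector and an offset) to the difference vectors of the word pairs participating in that relation, and that a candidate pair is scored by the point-to-plane distance of its difference vector, a plane-based relation representation might look as follows. The function names and toy data are hypothetical, not taken from the paper.

```python
import numpy as np

def fit_relation_plane(diff_vectors):
    """Fit a plane to the difference vectors of word pairs in a relation.

    The plane is parameterized by a unit normal n and offset d such that
    points x on the plane satisfy n . x + d = 0. The normal is taken as
    the direction of least variance (the smallest right-singular vector)
    of the centered difference vectors.
    """
    X = np.asarray(diff_vectors)
    centroid = X.mean(axis=0)
    # SVD of the centered data; the last right-singular vector is the
    # direction along which the points vary least, i.e. the plane normal.
    _, _, vt = np.linalg.svd(X - centroid)
    normal = vt[-1]
    offset = -normal @ centroid
    return normal, offset

def plane_distance(normal, offset, head_vec, tail_vec):
    """Point-to-plane distance of a candidate pair's difference vector."""
    return abs(normal @ (tail_vec - head_vec) + offset)

# Toy example with random embeddings (hypothetical data).
rng = np.random.default_rng(0)
pairs = rng.normal(size=(20, 2, 50))   # 20 (head, tail) pairs in R^50
diffs = pairs[:, 1] - pairs[:, 0]
n, d = fit_relation_plane(diffs)
print(plane_distance(n, d, pairs[0, 0], pairs[0, 1]))
```

On real word embeddings, a small plane distance would indicate that a pair likely participates in the relation. Unlike a single relation vector, a plane can accommodate one-to-many and sibling-like relations, since many distinct difference vectors can lie on the same plane.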
Keywords
Accuracy Rate, Representation Space, Relation Vector, Euclidean Distance Function, Sibling Relation