A relational Tsetlin machine with applications to natural language understanding

Journal of Intelligent Information Systems (2022)

Cited by 16 | Views 350

Abstract
Tsetlin machines (TMs) are a pattern recognition approach that uses finite state machines for learning and propositional logic to represent patterns. In addition to being natively interpretable, they have provided competitive accuracy for various tasks. In this paper, we increase the computing power of TMs by proposing a first-order logic-based framework with Herbrand semantics. The resulting TM is relational and can take advantage of logical structures appearing in natural language, to learn rules that represent how actions and consequences are related in the real world. The outcome is a logic program of Horn clauses, bringing in a structured view of unstructured data. In closed-domain question-answering, the first-order representation produces 10× more compact KBs, along with an increase in answering accuracy from 94.83% to 99.48%. The approach is further robust towards erroneous, missing, and superfluous information, distilling the aspects of a text that are important for real-world understanding.
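To make the "logic program of Horn clauses" concrete, the sketch below shows plain forward chaining over ground Horn clauses in a toy closed-domain QA setting. It is a minimal illustration of the representation only, not the paper's Relational TM learner; the predicates (`moved`, `picked`, `in`) and the example knowledge base are hypothetical.

```python
# Illustrative only: forward chaining over ground Horn clauses.
# This is NOT the Relational Tsetlin Machine learning algorithm;
# it only demonstrates the Horn-clause knowledge representation.
from typing import List, Set, Tuple

# A Horn clause is (body, head); a fact is a clause with an empty body.
Clause = Tuple[List[str], str]

def forward_chain(clauses: List[Clause]) -> Set[str]:
    """Derive all entailed atoms by repeatedly firing satisfied rules."""
    known = {head for body, head in clauses if not body}
    changed = True
    while changed:
        changed = False
        for body, head in clauses:
            if head not in known and all(atom in known for atom in body):
                known.add(head)
                changed = True
    return known

# Hypothetical KB linking actions to consequences, bAbI-style.
kb: List[Clause] = [
    ([], "moved(john, kitchen)"),                                   # fact
    ([], "picked(john, apple)"),                                    # fact
    (["moved(john, kitchen)"], "in(john, kitchen)"),                # rule
    (["in(john, kitchen)", "picked(john, apple)"],
     "in(apple, kitchen)"),                                         # rule
]

facts = forward_chain(kb)
# A question such as "Where is the apple?" reduces to checking
# which atom of the form in(apple, X) is entailed.
```

Answering a closed-domain question then amounts to membership queries against the entailed set, which is what makes the Horn-clause view of the text directly inspectable.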
Keywords
Tsetlin machine, Natural language processing, Logic programming, Knowledge representation, Question answering