Explanation of Action Plans Through Ontologies.

OTM Conferences (2018)

Cited by 27
Abstract
In recent years, more and more AI systems have been integrated into various aspects of human life, forming human-machine partnerships and collaboration. The term Digital Companion refers to the embodiment of AI as a human's co-worker. Explanations of why an AI arrived at specific decisions are highly beneficial: they enable the AI to operate more robustly, clarify to the user why the AI made certain choices, and significantly increase trust between humans and AI. A number of symbolic planners exist that use heuristic search to produce a sequence of actions reaching a given goal. So far, explanations of why a planner follows a certain series of decisions have mostly been embedded within the planner's own operation, forming so-called glass-box explanations. Integrating AI Planning (using PDDL) with Ontologies (using OWL) makes it possible to use reasoning to generate explanations of why, and conversely why not, certain actions were considered by the planner, without relying on the planner's internal functionality. The richer the knowledge base, the more precise the explanations of the planner's decision-making process can be. In this paper we present a general architecture for black-box plan explanations that is independent of the nature of the planner, illustrate the approach of integrating PDDL and OWL, and show how justifications in ontologies can be used to explain why a planner has taken certain actions.
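The PDDL side of the integration described above can be sketched with a minimal, hypothetical action schema (the domain, predicate, and action names below are illustrative and not taken from the paper). The idea is that each action's typed parameters, preconditions, and effects can be mirrored as OWL class and property assertions, so that a description-logic reasoner can compute justifications for why the action was, or was not, applicable in a given state:

```pddl
;; Illustrative logistics-style domain fragment (assumed names)
(define (domain logistics-sketch)
  (:requirements :strips :typing)
  (:types truck package location)
  (:predicates
    (at ?t - truck ?l - location)      ; truck ?t is at location ?l
    (in ?p - package ?t - truck))      ; package ?p is loaded in truck ?t
  (:action drive
    :parameters (?t - truck ?from ?to - location)
    :precondition (at ?t ?from)
    :effect (and (not (at ?t ?from))
                 (at ?t ?to))))
```

In an OWL mirror of this fragment, `truck`, `package`, and `location` would become classes and `at` and `in` object properties; an ontology justification for an entailment such as "the precondition `(at ?t ?from)` does not hold" then serves as a planner-independent why-not explanation.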
Keywords
PDDL, Ontologies, OWL, AI planning, Plan explanation