An Autonomous Cognitive Empathy Model Responsive to Users’ Facial Emotion Expressions

ACM Transactions on Interactive Intelligent Systems (2020)

Abstract
Successful social robot services depend on how robots interact with users. Effective service requires smooth, engaged, and human-like interaction in which the robot reacts appropriately to the user's affective state. This article proposes a novel Automatic Cognitive Empathy Model (ACEM) for humanoid robots, aimed at achieving longer and more engaged human-robot interactions (HRI) by recognizing users' emotions and responding to them appropriately. The proposed model continuously detects a user's affective state from facial expressions and generates the desired empathic behavior, either parallel or reactive, adapted to the user's personality. Affective states are detected with a stacked autoencoder network trained and tested on the RAVDESS dataset. The overall empathic model is verified in an experiment in which different emotions are elicited in participants and empathic behaviors are then applied according to the proposed hypotheses. The results confirm the effectiveness of the model in terms of the social and friendship concepts that participants perceived while interacting with the robot.
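The abstract states that affective states are detected with a stacked autoencoder trained and tested on RAVDESS, but gives no architectural details. The following is a minimal sketch of such a classifier in PyTorch; the input feature size, layer widths, and class head are illustrative assumptions, not the authors' implementation (the eight output classes correspond to the emotion categories labeled in RAVDESS).

```python
# Hypothetical sketch of a stacked-autoencoder emotion classifier (PyTorch).
# Layer sizes and the input feature dimension are assumptions, not from the paper.
import torch
import torch.nn as nn


class StackedAutoencoderClassifier(nn.Module):
    """Encoder layers usable for autoencoder-style pretraining, plus a softmax head."""

    def __init__(self, in_dim=2048, hidden_dims=(512, 128), n_classes=8):
        super().__init__()
        dims = (in_dim, *hidden_dims)
        # Stacked encoder: each layer compresses the previous representation.
        self.encoder = nn.Sequential(
            *[m for i in range(len(dims) - 1)
              for m in (nn.Linear(dims[i], dims[i + 1]), nn.ReLU())]
        )
        # Mirror decoder, used only for unsupervised reconstruction pretraining.
        self.decoder = nn.Sequential(
            *[m for i in reversed(range(len(dims) - 1))
              for m in (nn.Linear(dims[i + 1], dims[i]), nn.ReLU())]
        )
        # Classification head mapping the bottleneck code to emotion logits.
        self.head = nn.Linear(dims[-1], n_classes)

    def reconstruct(self, x):
        # Reconstruction path for pretraining the stacked layers.
        return self.decoder(self.encoder(x))

    def forward(self, x):
        # Inference path: facial-expression features -> emotion logits.
        return self.head(self.encoder(x))


if __name__ == "__main__":
    model = StackedAutoencoderClassifier()
    faces = torch.randn(4, 2048)      # stand-in for facial-expression feature vectors
    print(model(faces).shape)         # torch.Size([4, 8]) -> 8 emotion classes
```

In a stacked setup of this kind, the encoder layers are typically pretrained one at a time on reconstruction loss and the whole network is then fine-tuned with the classification head on labeled expressions.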
Keywords
Empathy, human-robot interaction, social robots, facial emotion detection, adaptive interaction, non-verbal behavior