Multimodal Emotion Recognition with Thermal and RGB-D Cameras for Human-Robot Interaction

HRI '20: ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, United Kingdom, March 2020

Abstract
Human emotion detection is an important aspect of social robotics and human-robot interaction (HRI). In this paper, we propose a vision-based multimodal emotion recognition method for social robots that combines gait data and facial thermal images. Our method can detect four human emotional states (i.e., neutral, happiness, anger, and sadness). We gathered data from 25 participants to build an emotion database for training and testing our classification models. We implemented, trained, and tested several approaches, namely Convolutional Neural Network (CNN), Hidden Markov Model (HMM), Support Vector Machine (SVM), and Random Forest (RF), to compare their emotion recognition ability and identify the best one. We then designed a hybrid model that uses both the gait and the thermal data; on our emotion database, its accuracy improves on the other models by 10%. This is a promising approach to be explored in real-time human-robot interaction scenarios.
Keywords
multimodal emotion recognition, thermal face, gait, HRI
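
The abstract describes comparing per-modality classifiers (SVM, RF, CNN, HMM) and a hybrid model that combines gait and thermal data, but it does not publish implementation details. The snippet below is a minimal sketch of one plausible baseline, namely decision-level (late) fusion of an SVM on gait features and a Random Forest on thermal-face features; the feature dimensions, classifier choices, and fusion weights are assumptions for illustration only, not the authors' method.

```python
# Sketch of a two-stream (gait + thermal) emotion classifier with late fusion.
# Data, feature sizes, and model settings are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

EMOTIONS = ["neutral", "happiness", "anger", "sadness"]  # four classes from the paper

rng = np.random.default_rng(0)
n_samples = 200  # stand-in for samples from the 25-participant database
X_gait = rng.normal(size=(n_samples, 48))      # assumed gait feature vectors
X_thermal = rng.normal(size=(n_samples, 32))   # assumed thermal-face feature vectors
y = rng.integers(0, len(EMOTIONS), size=n_samples)

idx_train, idx_test = train_test_split(
    np.arange(n_samples), test_size=0.25, random_state=0, stratify=y
)

# One classifier per modality (choices are arbitrary for this sketch).
clf_gait = SVC(kernel="rbf", probability=True).fit(X_gait[idx_train], y[idx_train])
clf_thermal = RandomForestClassifier(n_estimators=200, random_state=0).fit(
    X_thermal[idx_train], y[idx_train]
)

# Decision-level fusion: average the per-class probabilities of both streams.
proba = 0.5 * clf_gait.predict_proba(X_gait[idx_test]) \
      + 0.5 * clf_thermal.predict_proba(X_thermal[idx_test])
y_pred = proba.argmax(axis=1)
print("fused accuracy:", accuracy_score(y[idx_test], y_pred))
```

With real features, the same structure lets each modality be evaluated on its own (each stream's `predict_proba`) and jointly, which is one simple way to reproduce the kind of single-modality versus hybrid comparison reported in the abstract.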