A Non-contact Framework based on Thermal and Visual Imaging for Classification of Affective States during HCI

2020 4th International Conference on Trends in Electronics and Informatics (ICOEI) (48184), 2020

Citations: 4 | Views: 1
Abstract
This paper presents a non-contact system based on twin channels of thermal and visual image sequences to register the affective states of an individual during Human-Computer Interaction (HCI). Negative affective states such as stress, anxiety, and depression in students have raised significant concerns, which necessitates a smart HCI system as an assisting tool for psychologists. In this paper, we propose a two-stage smart system that classifies the affective state by clustering sequences of emotional states. The first stage obtains the dominant emotional state from an ensemble of cues from visual and thermal facial images using a newly proposed cascaded Convolutional Neural Network (CNN) model. We have named this 16-layer network EmoScale, as it classifies the dominant emotional state of an individual. The second stage clusters a sequence of the obtained emotional states, using a trained Hidden Markov Model (HMM), into one of four dominant affective states: stress, depression, anxiety, or healthy. We perform five-fold cross-validation of EmoScale on our self-prepared dataset as well as on the hetero-face database. The performance of the second stage has been compared with the standard Depression Anxiety Stress Scale (DASS) on 51 subjects, and the results are found to be promising.
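To make the two-stage pipeline concrete, the sketch below (not the authors' implementation) pairs a toy dual-branch CNN, standing in for the 16-layer EmoScale, with per-affect discrete HMMs scored by the forward algorithm; the emotion-label sequence from stage one is assigned to whichever affect model gives it the highest likelihood. The network layout, the assumed seven emotion classes, the three hidden states, and all parameter values are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of the two-stage pipeline described in the
# abstract: a dual-channel CNN produces per-frame dominant-emotion labels from paired
# visual and thermal face images, and per-affect discrete HMMs classify the resulting
# label sequence as stress, depression, anxiety, or healthy. Layer sizes, the number
# of emotion classes, and the HMM parameters are illustrative assumptions.
import numpy as np
import torch
import torch.nn as nn

N_EMOTIONS = 7          # assumed number of dominant emotional states per frame
AFFECTS = ["stress", "depression", "anxiety", "healthy"]


class DualChannelEmotionCNN(nn.Module):
    """Toy stand-in for EmoScale: separate conv branches for the visual (RGB)
    and thermal (single-channel) streams, fused before the classifier."""

    def __init__(self, n_classes: int = N_EMOTIONS):
        super().__init__()
        self.visual = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
        )
        self.thermal = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
        )
        self.classifier = nn.Linear(2 * 32 * 4 * 4, n_classes)

    def forward(self, visual_img, thermal_img):
        v = self.visual(visual_img).flatten(1)
        t = self.thermal(thermal_img).flatten(1)
        return self.classifier(torch.cat([v, t], dim=1))  # emotion logits


def hmm_log_likelihood(obs, log_pi, log_A, log_B):
    """Forward-algorithm log-likelihood of a discrete observation sequence under
    an HMM with start log-probs log_pi, transition log-probs log_A, and emission
    log-probs log_B (states x symbols)."""
    alpha = log_pi + log_B[:, obs[0]]
    for o in obs[1:]:
        alpha = np.logaddexp.reduce(alpha[:, None] + log_A, axis=0) + log_B[:, o]
    return np.logaddexp.reduce(alpha)


def classify_affect(emotion_sequence, affect_hmms):
    """Pick the affect whose HMM assigns the highest likelihood to the
    sequence of per-frame dominant-emotion labels."""
    scores = {name: hmm_log_likelihood(emotion_sequence, *params)
              for name, params in affect_hmms.items()}
    return max(scores, key=scores.get)


if __name__ == "__main__":
    # Stage 1: per-frame emotion labels from paired visual/thermal frames (random here).
    model = DualChannelEmotionCNN()
    visual = torch.rand(30, 3, 64, 64)    # 30 frames, RGB
    thermal = torch.rand(30, 1, 64, 64)   # 30 frames, thermal
    with torch.no_grad():
        emotions = model(visual, thermal).argmax(dim=1).numpy()

    # Stage 2: one (randomly initialised) discrete HMM per affect; in practice each
    # would be trained on labelled sequences, e.g. with Baum-Welch.
    rng = np.random.default_rng(0)
    n_states = 3
    affect_hmms = {}
    for name in AFFECTS:
        pi = rng.dirichlet(np.ones(n_states))
        A = rng.dirichlet(np.ones(n_states), size=n_states)
        B = rng.dirichlet(np.ones(N_EMOTIONS), size=n_states)
        affect_hmms[name] = (np.log(pi), np.log(A), np.log(B))

    print("Predicted affect:", classify_affect(emotions, affect_hmms))
```

The one-HMM-per-affect, maximum-likelihood decision rule used here is a common way to turn HMMs into sequence classifiers; the abstract does not specify how the trained HMM is applied, so this choice is an assumption.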
Keywords
EmoScale,Affect,SCNN,Facial Expression,HMM,HCI