A Multi-Camera Deep Neural Network For Detecting Elevated Alertness In Drivers

2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

Abstract
We present a system for detecting elevated levels of driver alertness in driver-facing video captured from multiple viewpoints. This problem is important in automotive safety, both as a feedback signal for determining driver engagement and as a means of automatically flagging anomalous driving events. We generated a dataset of videos from 25 participants, each completing an hour of driving sequences in a simulator comprising a mixture of normal and near-miss driving events. Our proposed system is a deep neural network that fuses information from three driver-facing cameras to estimate moments of elevated driver alertness. A novel aspect of the system is that it learns to actively re-weight the importance of the camera inputs depending on their content. We demonstrate that this approach is not only resilient to dropped or occluded frames, but also performs significantly better than a system trained on any single stream.
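The abstract's key architectural idea, learning to re-weight each camera stream based on its content, can be illustrated with a minimal gating sketch. This is not the authors' implementation; all dimensions, the per-camera scoring weights, and the sigmoid alertness head are assumptions made for illustration, and the per-camera CNN feature extractors are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: three driver-facing cameras, each stream already
# reduced to a FEAT_DIM feature vector by a per-camera CNN (not shown).
NUM_CAMERAS, FEAT_DIM = 3, 16

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class GatedFusion:
    """Content-dependent re-weighting of camera features (illustrative)."""
    def __init__(self, num_cams, dim):
        self.w_gate = rng.normal(0, 0.1, (num_cams, dim))  # per-camera scoring weights
        self.w_out = rng.normal(0, 0.1, dim)               # alertness output head

    def forward(self, feats):
        # feats: (num_cams, dim); a dropped/occluded camera is all zeros.
        scores = np.einsum('cd,cd->c', self.w_gate, feats)  # one score per camera
        alpha = softmax(scores)                             # learned importance weights
        fused = alpha @ feats                               # weighted sum of features
        prob = 1.0 / (1.0 + np.exp(-(fused @ self.w_out)))  # elevated-alertness prob.
        return prob, alpha

model = GatedFusion(NUM_CAMERAS, FEAT_DIM)
feats = rng.normal(size=(NUM_CAMERAS, FEAT_DIM))
feats[1] = 0.0  # simulate an occluded second camera
prob, alpha = model.forward(feats)
```

Because the gate scores depend on the features themselves, an all-zero (occluded) stream contributes nothing informative to the fused representation, which is one plausible reading of the resilience claim in the abstract.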
Keywords
affective computing, autonomous driving, deep learning, multi-camera systems