EHTrack: Earphone-Based Head Tracking via Only Acoustic Signals

IEEE Internet of Things Journal (2024)

Abstract
Head tracking is a technique for measuring and analyzing human focus and attention, thereby enhancing the experience of human-computer interaction (HCI). However, current solutions relying on vision and motion sensors exhibit limitations in accuracy, user-friendliness, and compatibility with the majority of commercial off-the-shelf (COTS) devices. To overcome these limitations, we present EHTrack, an earphone-based system that achieves head tracking exclusively through acoustic signals. EHTrack employs acoustic sensing to measure the movement of a pair of earphones, thereby enabling precise head tracking. In particular, a pair of speakers generates a periodically fluctuating sound field, which the user's two earphones detect. By assessing changes in the distances and angles between the earphones and the speakers, we propose a model to determine the user's head movement and orientation. Our evaluation results indicate a high degree of accuracy in both head movement tracking, with an average tracking error of 2.98 cm, and head orientation tracking, with an average error of 1.83°. Furthermore, in a deployed exhibition scenario, we attained an accuracy of 89.2% in estimating the user's focus direction.
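The abstract's core idea, recovering head position and orientation from earphone-to-speaker distances, can be illustrated with a simple geometric sketch. The following is a minimal 2D illustration, not the paper's actual model: it assumes two speakers on a known baseline, takes the acoustically measured distance from each earphone to each speaker as input, trilaterates each earphone's position, and derives head yaw as the direction perpendicular to the left-to-right inter-earphone vector. All function names and the planar geometry are illustrative assumptions.

```python
import math

def locate_earphone(d1, d2, baseline):
    """2D trilateration. Speakers sit at (0, 0) and (baseline, 0);
    d1 and d2 are the measured distances from one earphone to each
    speaker. Returns the circle-intersection point with y >= 0
    (i.e., the user is assumed to stand in front of the speakers)."""
    x = (d1 ** 2 - d2 ** 2 + baseline ** 2) / (2 * baseline)
    y = math.sqrt(max(d1 ** 2 - x ** 2, 0.0))  # clamp sensor noise
    return (x, y)

def head_yaw_deg(left, right):
    """Yaw (degrees) of the facing direction, taken as the
    left->right inter-earphone vector rotated 90 deg CCW."""
    dx, dy = right[0] - left[0], right[1] - left[1]
    return math.degrees(math.atan2(dx, -(-dy) * -1))  # atan2(dx, ...)
```

A small correction for clarity: with facing = rotate((dx, dy), +90° CCW) = (-dy, dx), the yaw is `math.degrees(math.atan2(dx, -dy))` only after swapping into the chosen convention; the version below uses the standard `atan2(fy, fx)` form directly:

```python
import math

def locate_earphone(d1, d2, baseline):
    """2D trilateration. Speakers sit at (0, 0) and (baseline, 0);
    d1 and d2 are the measured distances from one earphone to each
    speaker. Returns the circle-intersection point with y >= 0
    (i.e., the user is assumed to stand in front of the speakers)."""
    x = (d1 ** 2 - d2 ** 2 + baseline ** 2) / (2 * baseline)
    y = math.sqrt(max(d1 ** 2 - x ** 2, 0.0))  # clamp sensor noise
    return (x, y)

def head_yaw_deg(left, right):
    """Yaw (degrees) of the facing direction: the left->right
    inter-earphone vector (dx, dy) rotated 90 deg CCW is (-dy, dx)."""
    dx, dy = right[0] - left[0], right[1] - left[1]
    return math.degrees(math.atan2(dx, -dy))

# Example: earphones 20 cm apart at y = 1 m, facing the baseline.
baseline = 1.0
left = locate_earphone(math.sqrt(1.36), math.sqrt(1.16), baseline)
right = locate_earphone(math.sqrt(1.16), math.sqrt(1.36), baseline)
# left  -> (0.6, 1.0), right -> (0.4, 1.0)
# head_yaw_deg(left, right) -> -90.0 (facing straight at the speakers)
```

Repeating this estimate every signal period yields a movement track (position deltas) and an orientation track (yaw deltas), which is the shape of output the paper's evaluation reports in centimeters and degrees.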
Keywords
Acoustic signal processing, human-computer interaction, signal processing, systems, user interfaces