HCI on the Table: Robust Gesture Recognition Using Acoustic Sensing in Your Hand

IEEE Access (2020)

Abstract
The paper proposes a new HCI mechanism for device-free gesture recognition on a table using acoustic signals, which extends gesture input and interaction beyond the small screen of a mobile device and lets users provide input without blocking the screen view. Previous research has either relied on additional devices (e.g., special wearables or a mouse) or required active acoustic signals, which add cost and hinder wide adoption, whereas we explore device-free gesture recognition using passive acoustic signals. This approach is more challenging because there is no established way to suppress the inherent ambient noise and extract stable gesture features. We fuse short-time energy (STE) and zero-crossing rate (ZCR) to identify the effective signal segments in the raw input, and leverage Mel-frequency cepstral coefficients (MFCC) and cochlear filter cepstral coefficients (CFCC) to extract stable features for different gestures. Feeding these features to a support vector machine (SVM) classifier achieves high gesture recognition accuracy in noisy scenarios and under mismatched conditions. An implementation on the Android system performs feature extraction and gesture recognition in real time. Extensive evaluations show that our algorithm tolerates noise well and that the system recognizes seven common gestures (click, flip left/right, scroll up/down, zoom in/out) on smart devices with an accuracy of 93.2%.
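To make the pipeline concrete, below is a minimal Python sketch of the chain the abstract describes: STE/ZCR fusion for gesture-segment detection, MFCC feature extraction, and SVM classification. It is not the authors' implementation; librosa and scikit-learn stand in for the Android code, the frame sizes, thresholds, RBF kernel, and mean-pooling of MFCC frames are all illustrative assumptions, and the CFCC branch is omitted because it has no standard library routine.

```python
# Minimal sketch of the abstract's pipeline (not the authors' code).
# Frame sizes, thresholds, kernel, and pooling are assumptions.
import numpy as np
import librosa
from sklearn.svm import SVC

FRAME, HOP = 512, 256  # assumed analysis frame and hop (samples)

def detect_gesture_frames(signal, ste_thresh=1e-4, zcr_thresh=0.3):
    """Fuse short-time energy (STE) and zero-crossing rate (ZCR) to flag
    frames that likely contain a gesture rather than ambient noise."""
    frames = librosa.util.frame(signal, frame_length=FRAME, hop_length=HOP)
    ste = np.mean(frames ** 2, axis=0)  # per-frame short-time energy
    zcr = librosa.feature.zero_crossing_rate(
        signal, frame_length=FRAME, hop_length=HOP, center=False)[0]
    n = min(len(ste), len(zcr))
    # Friction sounds from a finger on the table are energetic but less
    # noise-like, so require high STE and low ZCR at the same time.
    return (ste[:n] > ste_thresh) & (zcr[:n] < zcr_thresh)

def extract_features(signal, sr):
    """Compute MFCCs over the detected gesture frames and mean-pool them
    into one fixed-length vector per clip for the SVM."""
    mask = detect_gesture_frames(signal)
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=13,
                                n_fft=FRAME, hop_length=HOP, center=False)
    n = min(mfcc.shape[1], len(mask))
    active = mfcc[:, :n][:, mask[:n]]
    if active.size == 0:  # no gesture segment found in this clip
        return np.zeros(13)
    return active.mean(axis=1)

def train_classifier(clips, labels, sr):
    """clips: list of 1-D float arrays; labels: the seven gesture names."""
    X = np.stack([extract_features(c, sr) for c in clips])
    clf = SVC(kernel="rbf")  # kernel choice is an assumption
    clf.fit(X, labels)
    return clf
```

Mean-pooling the per-frame MFCCs is one simple way to obtain the fixed-length input an SVM requires; the paper does not specify its aggregation scheme.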
Keywords
Human-computer interaction, acoustic sensing, gesture recognition, MFCC, CFCC, Android system