SynthoGestures: A Novel Framework for Synthetic Dynamic Hand Gesture Generation for Driving Scenarios

Adjunct Proceedings of the 36th Annual ACM Symposium on User Interface Software & Technology (UIST 2023 Adjunct)

Abstract
Creating a diverse and comprehensive dataset of hand gestures for dynamic human-machine interfaces in the automotive domain can be challenging and time-consuming. To overcome this challenge, we propose using synthetic hand gestures generated from virtual 3D models. In this paper, we present our open-source framework, which uses Unreal Engine to synthesize realistic static and dynamic hand gestures, offering customization options and reducing the risk of overfitting. Multiple variants, covering gesture speed, performance style, and hand shape, are generated to improve generalizability. In addition, we simulate different camera locations and types, such as RGB, infrared, and depth cameras, without incurring the additional time, effort, or cost of acquiring these cameras. Experimental results demonstrate that our proposed framework, SynthoGestures, improves gesture recognition accuracy and can replace or augment real-hand datasets. By saving time and effort in dataset creation, our tool accelerates the development of gesture recognition systems for automotive and non-automotive applications.
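The abstract describes generating multiple variants of each gesture (speed, performance, hand shape) and simulating different camera types and locations. A minimal sketch of such variant sampling is shown below; all names and parameter ranges here are illustrative assumptions, not the actual SynthoGestures API.

```python
import random
from dataclasses import dataclass

# Hypothetical sketch of per-clip variant sampling for a synthetic-gesture
# pipeline. The parameter names and ranges are assumptions for illustration,
# not the authors' actual Unreal Engine configuration.

CAMERA_TYPES = ("rgb", "infrared", "depth")  # sensor modalities named in the abstract

@dataclass
class GestureVariant:
    speed_scale: float        # playback-speed multiplier for the gesture animation
    hand_scale: float         # uniform scale applied to the hand mesh
    camera_type: str          # simulated sensor modality
    camera_offset_cm: tuple   # (x, y, z) jitter of the camera mount, in cm

def sample_variants(n: int, seed: int = 0) -> list[GestureVariant]:
    """Draw n randomized variants of one base gesture clip."""
    rng = random.Random(seed)
    return [
        GestureVariant(
            speed_scale=rng.uniform(0.7, 1.3),
            hand_scale=rng.uniform(0.9, 1.1),
            camera_type=rng.choice(CAMERA_TYPES),
            camera_offset_cm=tuple(rng.uniform(-2.0, 2.0) for _ in range(3)),
        )
        for _ in range(n)
    ]
```

Rendering each base gesture under many such sampled configurations is one way to obtain the diversity the abstract credits with reducing overfitting.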
Keywords
Gesture Recognition, Synthetic Data, Data Augmentation, Personalization, Deep Learning