Proportional Myoelectric Control in a Virtual Reality Environment

2022 IEEE 14th Image, Video, and Multidimensional Signal Processing Workshop (IVMSP)

Abstract
Translating input modalities such as hand interactions, speech, and eye tracking in virtual reality offers an immersive user experience. Tracking the user's hand gestures is particularly important, since gestures translate user intentions into actions in the virtual environment. In this work, we developed a virtual reality application that incorporates electromyography-based deep learning methods to recognize and estimate hand movements online. The application automates all user controls, offering substantial potential for rehabilitation purposes.
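
The abstract does not specify the signal pipeline or model architecture, so the following Python sketch only illustrates the general idea of online proportional myoelectric control: each streaming EMG window is classified by a small stand-in network, and the resulting command is scaled by the window's mean absolute value. The channel count, window length, feature choice, and the GestureNet model are assumptions made for illustration, not the paper's implementation.

```python
# Illustrative sketch of online proportional myoelectric control (assumed setup,
# not the paper's pipeline): 8-channel surface EMG at 1 kHz, 200 ms windows,
# a toy PyTorch classifier standing in for the deep learning model.
import numpy as np
import torch
import torch.nn as nn

FS = 1000          # sampling rate (Hz), assumed
WINDOW = 200       # 200 ms analysis window, assumed
CHANNELS = 8       # electrode count, assumed

class GestureNet(nn.Module):
    """Toy stand-in for an EMG gesture classifier."""
    def __init__(self, n_features=CHANNELS, n_gestures=5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 32), nn.ReLU(),
            nn.Linear(32, n_gestures),
        )

    def forward(self, x):
        return self.net(x)

def mean_absolute_value(window):
    """Per-channel MAV feature; its magnitude drives the proportional output."""
    return np.mean(np.abs(window), axis=0)

def control_step(model, window, rest_level=0.01, max_level=0.5):
    """One online update: classify the gesture and scale it by EMG intensity."""
    mav = mean_absolute_value(window)                      # shape: (CHANNELS,)
    with torch.no_grad():
        logits = model(torch.tensor(mav, dtype=torch.float32))
        gesture = int(torch.argmax(logits))
    # Proportional term: overall activation normalized to [0, 1].
    intensity = float(np.clip((mav.mean() - rest_level) / (max_level - rest_level),
                              0.0, 1.0))
    return gesture, intensity

# Example: synthetic EMG windows as a stand-in for a live stream.
model = GestureNet()
stream = np.random.randn(5 * WINDOW, CHANNELS) * 0.1
for start in range(0, len(stream) - WINDOW + 1, WINDOW):
    g, k = control_step(model, stream[start:start + WINDOW])
    print(f"gesture={g}, proportional output={k:.2f}")
```

In a virtual reality application, the gesture label would select the action (e.g., grasp or release) while the proportional intensity would drive its magnitude, such as the closing speed or force of a virtual hand.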
Keywords
virtual reality environment,input modalities,hand interactions,eye tracking,immersive user experience,user intentions,virtual environments,virtual reality application,deep learning methods,recognizing estimating hand movements,application automates all user controls,proportional myoelectric control