Augmented 3D hands: a gesture-based mixed reality system for distributed collaboration

J. Multimodal User Interfaces (2017)

Cited by 56 | Views 20
Abstract
Distributed collaborations between two or more participants on a task involving tangible artifacts (e.g., a machine, a patient, a tool) have become increasingly common in recent years due to rapid developments in information and communication technologies (ICT). In this paper we focus on a specific type of remote-collaboration system in which a remote helper guides a local worker through a repair or maintenance task using audio communication and hand gestures. An established ICT approach to supporting this type of collaboration is to provide a shared visual space and some form of remote gesturing. The shared space typically consists of a video capture of the remote workspace, displayed on a 2D screen. However, this approach has its limitations. Firstly, it does not give the helper a sufficient understanding of the spatial relationships between objects in the remote workspace. Secondly, it does not allow the helper to gesture in 3D. To address these issues, we propose a Mixed Reality multimodal system that improves on previous 2D systems by introducing 3D real-time capture and rendering of both the remote workspace and the helping hands, and by creating a 3D shared visual space in which the remote workspace is co-located with the helping hands. Within this system, we explore the possibility of increasing the feeling of immersion and co-presence through head tracking, stereoscopic rendering, inter-occlusion handling and virtual shadowing. We introduce HandsIn3D, a system developed as a proof of concept, and present the results of experiments that verify the feasibility of our approach.
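
The core idea described in the abstract, co-locating a 3D capture of the remote workspace with a 3D capture of the helper's hands in one shared coordinate frame, can be illustrated with a minimal sketch. The point-cloud arrays, the `colocate` helper and the `hands_to_workspace` calibration transform below are hypothetical illustrations under assumed inputs, not the authors' implementation.

```python
# Minimal sketch (not the HandsIn3D implementation): the remote workspace and
# the helper's hands are each captured as 3D point clouds (e.g., from a Kinect
# depth sensor) and merged into one shared coordinate frame by applying a
# rigid transform to the hand cloud. `hands_to_workspace` stands in for a
# pre-calibrated 4x4 extrinsic between the two sensors (assumed, not given).

import numpy as np

def apply_rigid_transform(points, transform):
    """Apply a 4x4 homogeneous rigid transform to an (N, 3) point cloud."""
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])
    return (homogeneous @ transform.T)[:, :3]

def colocate(workspace_points, hand_points, hands_to_workspace):
    """Bring the hand cloud into the workspace frame, yielding a single
    shared 3D scene that could then be rendered stereoscopically."""
    hands_in_workspace = apply_rigid_transform(hand_points, hands_to_workspace)
    return np.vstack([workspace_points, hands_in_workspace])

if __name__ == "__main__":
    workspace = np.random.rand(1000, 3)   # stand-in for a captured workspace cloud
    hands = np.random.rand(200, 3)        # stand-in for the captured helping hands
    shared_scene = colocate(workspace, hands, np.eye(4))  # identity = same frame
    print(shared_scene.shape)             # (1200, 3)
```

Once both clouds live in one frame, per-pixel depth comparison during rendering gives the inter-occlusion handling mentioned in the abstract essentially for free, since nearer points naturally hide farther ones.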
Keywords
Mixed reality, Tele-presence, 3D capture and rendering, Remote collaboration, Microsoft Kinect, Shared visual space, Hand gesture