Conveying language through haptics: a multi-sensory approach.

UbiComp '18: The 2018 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Singapore, October 2018.

Abstract
In our daily lives, we rely heavily on our visual and auditory channels to receive information from others. In the case of impairment, or when the visual and auditory channels are already saturated with information, alternative methods of communication are needed. A haptic language offers the potential to deliver information to a user when these channels are unavailable. Previous haptic languages include approaches that deconstruct acoustic signals into features and display them through a haptic device, as well as haptic adaptations of Braille and Morse code; however, these approaches are unintuitive, slow at presenting language, or require a large surface area. We propose a multi-sensory haptic device called MISSIVE, worn on the upper arm, that can produce brief cues in sufficient quantity to encode the full English phoneme set. We evaluated our approach by teaching subjects a subset of 23 phonemes, and demonstrated 86% accuracy on a 50-word identification task after 100 minutes of training.
Keywords
Haptics, Multi-sensory, Wearable, Speech