Conveying Situational Information to People with Visual Impairments.

arXiv: Human-Computer Interaction (2019)

Abstract
Knowing who is in one's vicinity is key to managing privacy in everyday environments, but is challenging for people with visual impairments. Wearable cameras and other sensors may be able to detect such information, but how should this complex visually-derived information be conveyed in a way that is discreet, intuitive, and unobtrusive? Motivated by previous studies on the specific information that visually impaired people would like to have about their surroundings, we created three medium-fidelity prototypes: 1) a 3D printed model of a watch to convey tactile information; 2) a smartwatch app for haptic feedback; and 3) a smartphone app for audio feedback. A usability study with 14 participants with visual impairments identified a range of practical issues (e.g., speed of conveying information) and design considerations (e.g., configurable privacy bubble) for conveying privacy feedback in real-world contexts.