A gaze gesture-based paradigm for situational impairments, accessibility, and rich interactions

2018 ACM Symposium on Eye Tracking Research & Applications (ETRA 2018)

Abstract
Gaze gesture-based interactions on a computer are promising, but existing systems are limited by the number of supported gestures, recognition accuracy, the need to remember stroke order, lack of extensibility, and so on. We present a gaze gesture-based interaction framework in which a user can design gestures and associate them with appropriate commands such as minimize, maximize, and scroll. This allows the user to interact with a wide range of applications using a common set of gestures. Furthermore, our gesture recognition algorithm is independent of the screen size and resolution, and the user can draw the gesture anywhere on the target application. Results from a user study involving seven participants showed that the system recognizes a set of nine gestures with an accuracy of 93% and an F-measure of 0.96. We envision that this framework can be leveraged in developing solutions for situational impairments and accessibility, and for implementing a rich interaction paradigm.
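The abstract does not describe the recognition algorithm itself, but independence from screen size, resolution, and drawing position typically comes from normalizing the gaze trajectory before template matching. A minimal illustrative sketch of that idea (hypothetical code, not the authors' implementation; all function names are assumptions):

```python
import math

def normalize(points):
    """Translate a gaze trajectory to its centroid and scale it to a
    unit bounding box, so matching does not depend on where on the
    screen the gesture was drawn or on the screen's size/resolution.
    (Illustrative only; the paper does not publish its algorithm.)"""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in points]

def distance(a, b):
    """Mean Euclidean distance between two equal-length normalized
    trajectories; the stored template with the smallest distance
    would be the recognized gesture."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

# The same L-shaped gesture drawn small in one corner and large
# elsewhere normalizes to (nearly) identical trajectories.
small = [(0, 0), (0, 10), (10, 10)]
large = [(500, 300), (500, 500), (700, 500)]
print(distance(normalize(small), normalize(large)))  # ~0.0
```

A full recognizer would also resample each trajectory to a fixed number of points before comparison, since raw gaze samples arrive at tracker-dependent rates.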
Keywords
Gaze gestures, accessibility, situational impairment, eye tracking