Web Accessibility Evaluation In A Crowdsourcing-Based System With Expertise-Based Decision Strategy

15th International Web for All Conference (W4A), 2018

Cited 7 | Viewed 29
Abstract
The rising awareness of accessibility increases the demand for Web accessibility evaluation projects that verify the implementation of Web accessibility guidelines and identify accessibility barriers in websites. However, the complexity of accessibility evaluation tasks and the lack of experts limit the scope and significance of such projects. Because of this complexity, these projects cannot directly rely on crowdsourcing, a technique that has made great contributions in many fields by dividing a problem into many tedious micro-tasks and solving them in parallel. To address this issue, we develop a new crowdsourcing-based Web accessibility evaluation system with two novel decision strategies: the golden set strategy and the time-based golden set strategy. These strategies synthesize high-accuracy task results from micro-tasks solved by workers with heterogeneous expertise. An accessibility evaluation of 98 websites by 55 workers with varying experience verifies that our system completes the evaluation in half the time and with a 7.2% improvement in accuracy over the current approach.
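The abstract does not spell out how the golden set strategy works, but a golden set strategy conventionally seeds the task stream with items whose correct answers are known, scores each worker against those items, and weights their votes on unknown tasks accordingly. The Python sketch below illustrates that general idea only; the function names (`estimate_worker_accuracy`, `weighted_vote`), the binary pass/fail judgments, and the 0.5 default weight are illustrative assumptions, not the authors' implementation.

```python
from collections import defaultdict

# Hypothetical data shape: answers maps task_id -> {worker_id: answer};
# "golden" tasks have known ground-truth answers used to score workers.

def estimate_worker_accuracy(answers, golden):
    """Score each worker by agreement with the golden-set answers."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for task_id, truth in golden.items():
        for worker, answer in answers.get(task_id, {}).items():
            total[worker] += 1
            correct[worker] += int(answer == truth)
    # Fall back to 0.5 (chance level for a binary judgment) when a
    # worker has answered no golden tasks.
    return {w: correct[w] / total[w] if total[w] else 0.5 for w in total}

def weighted_vote(task_answers, accuracy):
    """Aggregate one task's answers, weighting each vote by the
    worker's estimated golden-set accuracy."""
    scores = defaultdict(float)
    for worker, answer in task_answers.items():
        scores[answer] += accuracy.get(worker, 0.5)
    return max(scores, key=scores.get)

if __name__ == "__main__":
    golden = {"g1": "fail", "g2": "pass"}          # seeded known answers
    answers = {
        "g1": {"alice": "fail", "bob": "pass"},
        "g2": {"alice": "pass", "bob": "pass"},
        "t1": {"alice": "fail", "bob": "pass"},    # unknown task
    }
    acc = estimate_worker_accuracy(answers, golden)
    print(weighted_vote(answers["t1"], acc))       # -> "fail"
```

A time-based variant, as the name suggests, would presumably re-estimate these weights over a recent time window rather than over all history, so a worker's weight tracks their current performance.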
Keywords
Web Accessibility, Evaluation System, Expertise, Crowdsourcing