An evaluation of assessment stability in a massive open online course using item response theory

Education and Information Technologies (2024)

Abstract
For Massive Open Online Courses (MOOCs) to offer trustworthy credentials, their assessments must be valid, reliable, and fair. Item Response Theory (IRT) provides a robust framework for evaluating these properties. However, for IRT to be applicable, the assessment items must satisfy certain conditions, among them that item difficulties remain stable over time. The present study evaluates whether this property holds for MOOC assessments using a case study: an AP Physics course offered by MITx. To do so, we estimated the item parameters in three administrations of the course and compared them across administrations. We found that although many items did not meet certain quality criteria, more than a third of the items had stable characteristics over time. Our results demonstrate that IRT can help evaluate assessment quality in MOOCs, and they suggest that these high-quality items can serve as the core of the course assessment, since their properties are consistent across runs. We recommend that creators and instructors of online courses apply similar procedures to evaluate their courses' assessment and instructional design. Ensuring that items have similar properties over time supports the validity, reliability, and fairness of online-course assessment.
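The stability check described above (estimating item difficulties in separate course runs and comparing them) can be sketched in a few lines. This is an illustrative simulation, not the authors' method: it simulates two "runs" of the same items under a Rasch (1PL) model, estimates each item's difficulty with the simple logit-of-proportion-correct approximation rather than a full IRT fit, and then correlates the two sets of estimates to gauge cross-run stability. All names, sample sizes, and the estimation shortcut are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_run(abilities, difficulties):
    # Rasch model: P(correct) = sigmoid(theta - b)
    logits = abilities[:, None] - difficulties[None, :]
    p = 1.0 / (1.0 + np.exp(-logits))
    return (rng.random(p.shape) < p).astype(int)

def estimate_difficulty(responses):
    # Crude stand-in for an IRT fit: b_hat = -logit(proportion correct),
    # clipped to avoid infinite logits for all-correct/all-wrong items.
    p = responses.mean(axis=0).clip(0.01, 0.99)
    return -np.log(p / (1.0 - p))

true_b = np.linspace(-2, 2, 20)                      # 20 hypothetical items
run1 = simulate_run(rng.normal(0, 1, 500), true_b)   # 500 learners per run
run2 = simulate_run(rng.normal(0, 1, 500), true_b)

b1 = estimate_difficulty(run1)
b2 = estimate_difficulty(run2)
r = np.corrcoef(b1, b2)[0, 1]
print(f"cross-run difficulty correlation: {r:.2f}")
```

A high correlation between the two difficulty vectors indicates the kind of cross-run stability the study requires; items that fall far from the identity line would be candidates for revision rather than inclusion in a stable assessment core.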
Keywords
Online courses, MOOCs, Item Response Theory, Reliability, Fairness, Online assessment, Assessment evaluation