Explainable Learning Analytics: Assessing the stability of student success prediction models by means of explainable AI

Decision Support Systems (2024)

Abstract
Beyond managing student dropout, higher education stakeholders need decision support to consistently influence the student learning process so that students stay motivated, engaged, and successful. At the course level, combining predictive analytics with self-regulation theory can help instructors determine the best study advice and allow learners to better self-regulate and decide how they want to learn. However, the best-performing techniques are often black-box models that favor performance over interpretability and are heavily influenced by course context. In this study, we argue that explainable AI has the potential not only to uncover the reasons behind model decisions, but also to reveal their stability across contexts, effectively bridging the gap between predictive and explanatory learning analytics (LA). In contributing to decision support systems research, this study (1) leverages traditional techniques, such as concept drift and performance drift, to investigate the stability of student success prediction models over time; (2) uses SHapley Additive exPlanations (SHAP) in a novel way to explore the stability of the feature importance rankings extracted for these models; and (3) generates new insights that emerge from stable features across cohorts, enabling teachers to determine study advice. We believe this study makes a strong contribution to education research at large and expands the field of LA by augmenting the interpretability and explainability of prediction algorithms and ensuring their applicability in changing contexts.
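The abstract does not specify how ranking stability is measured, but the idea in point (2) can be sketched as follows: take the mean absolute SHAP value per feature in each cohort, convert those importances to ranks, and compare the ranks across cohorts with a rank correlation. The feature importances and the use of Spearman's rho below are illustrative assumptions, not values or choices taken from the paper.

```python
import numpy as np

def importance_ranks(mean_abs_shap):
    # Rank features by mean |SHAP| value, most important first (rank 1).
    order = np.argsort(-np.asarray(mean_abs_shap))
    ranks = np.empty_like(order)
    ranks[order] = np.arange(1, len(order) + 1)
    return ranks

def spearman_rho(r1, r2):
    # Spearman rank correlation, no-ties formula: 1 - 6*sum(d^2)/(n(n^2-1)).
    d = r1.astype(float) - r2.astype(float)
    n = len(d)
    return 1.0 - 6.0 * np.sum(d**2) / (n * (n**2 - 1))

# Hypothetical mean |SHAP| importances for the same five features
# in two course cohorts (illustrative numbers, not from the paper).
cohort_a = [0.40, 0.25, 0.15, 0.12, 0.08]
cohort_b = [0.35, 0.30, 0.12, 0.14, 0.09]  # two adjacent features swap rank

rho = spearman_rho(importance_ranks(cohort_a), importance_ranks(cohort_b))
print(round(rho, 2))  # → 0.9, a ranking that is stable across the two cohorts
```

A rho near 1 across cohorts would suggest the model relies on the same features in each context, which is the precondition for reusing the derived study advice.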
Keywords
Learning analytics, Self-regulated learning, Explainable AI, Model stability