Pre-training Question Embeddings for Improving Knowledge Tracing with Self-supervised Bi-graph Co-contrastive Learning

ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA (2024)

Abstract
Learning high-quality vector representations (a.k.a. embeddings) of educational questions lies at the core of knowledge tracing (KT), the task of estimating students' knowledge states by predicting the probability that they answer questions correctly. Although existing KT efforts have leveraged question information to achieve remarkable improvements, most of them learn question embeddings under the supervised learning paradigm. In this article, we propose a novel question embedding pre-training method for improving knowledge tracing with self-supervised Bi-graph Co-contrastive learning (BiCo). Technically, building on the self-supervised learning paradigm, we first select two similar but distinct views (representing objective and subjective semantic perspectives, respectively) as the semantic sources of question embeddings. Then, we design a primary task (structure recovery) together with two auxiliary tasks (question difficulty recovery and contrastive learning) to further enhance the representativeness of questions. Finally, extensive experiments on two real-world datasets show that BiCo has higher expressive power, enabling KT methods to predict students' performance more effectively.
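The abstract does not include implementation details, so the following is only a minimal PyTorch sketch of how a multi-task pre-training objective of this general shape (a structure-recovery primary task plus difficulty-recovery and cross-view contrastive auxiliary tasks) could be combined. All names (ViewEncoder, the loss functions, the toy data, and the equal loss weighting) are hypothetical illustrations and are not the authors' code.

```python
# Hypothetical sketch: pre-training question embeddings from two views with a
# structure-recovery loss, a difficulty-recovery loss, and an InfoNCE-style
# cross-view contrastive loss. Shapes and data are toy placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ViewEncoder(nn.Module):
    """Maps raw question features of one view to question embeddings."""

    def __init__(self, in_dim: int, emb_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, emb_dim), nn.ReLU(), nn.Linear(emb_dim, emb_dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def structure_recovery_loss(z: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
    """Reconstruct a question-question adjacency matrix from embedding inner products."""
    logits = z @ z.t()
    return F.binary_cross_entropy_with_logits(logits, adj)


def difficulty_recovery_loss(z: torch.Tensor, head: nn.Module, diff: torch.Tensor) -> torch.Tensor:
    """Regress a scalar difficulty per question from its embedding."""
    pred = head(z).squeeze(-1)
    return F.mse_loss(pred, diff)


def contrastive_loss(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """InfoNCE between the two views: the same question in both views is the positive pair."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    sim = z1 @ z2.t() / tau
    labels = torch.arange(z1.size(0), device=z1.device)
    return 0.5 * (F.cross_entropy(sim, labels) + F.cross_entropy(sim.t(), labels))


# Toy usage with random data, purely to show how the three losses combine.
num_q, feat_dim, emb_dim = 128, 32, 64
x1, x2 = torch.randn(num_q, feat_dim), torch.randn(num_q, feat_dim)  # two views
adj = (torch.rand(num_q, num_q) < 0.1).float()                       # toy question graph
diff = torch.rand(num_q)                                             # toy difficulty in [0, 1]

enc1, enc2 = ViewEncoder(feat_dim, emb_dim), ViewEncoder(feat_dim, emb_dim)
diff_head = nn.Linear(emb_dim, 1)
opt = torch.optim.Adam(
    [*enc1.parameters(), *enc2.parameters(), *diff_head.parameters()], lr=1e-3
)

for step in range(5):
    z1, z2 = enc1(x1), enc2(x2)
    loss = (
        structure_recovery_loss(z1, adj)                  # primary task
        + difficulty_recovery_loss(z1, diff_head, diff)   # auxiliary task 1
        + contrastive_loss(z1, z2)                        # auxiliary task 2
    )
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In practice the two encoders would operate on the actual bi-graph views described in the paper, and the relative weights of the three terms would be tuned rather than summed equally as in this sketch.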
Keywords
Knowledge tracing, self-supervised learning, bi-graph, co-contrastive learning