Noisy Node Classification by Bi-level Optimization based Multi-teacher Distillation
CoRR (2024)
Abstract
Previous graph neural networks (GNNs) usually assume that graph data comes
with clean labels for representation learning, but this assumption rarely
holds in real applications. In this paper, we propose a new multi-teacher
distillation method based on bi-level optimization (namely BO-NNC) to conduct
noisy node classification on graph data. Specifically, we first employ
multiple self-supervised learning methods to train diverse teacher models,
and then aggregate their predictions through a teacher weight matrix.
Furthermore, we design a new bi-level optimization strategy to dynamically
adjust the teacher weight matrix based on the training progress of the
student model. Finally, we design a label improvement module to improve
label quality. Extensive experimental results on real datasets show that our
method outperforms state-of-the-art methods.
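The teacher-aggregation step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, tensor shapes, and the choice to softmax-normalize the teacher weight matrix per node are all assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def aggregate_teachers(teacher_logits, teacher_weights):
    """Combine per-teacher predictions into soft targets for the student.

    teacher_logits  : (T, N, C) logits from T teachers for N nodes, C classes.
    teacher_weights : (T, N) per-node teacher weights (a stand-in for the
                      "teacher weight matrix" of the abstract); in the paper
                      these would be adjusted by the bi-level optimization.
    """
    probs = softmax(teacher_logits, axis=-1)   # (T, N, C) class distributions
    w = softmax(teacher_weights, axis=0)       # normalize weights over teachers
    # Weighted sum over teachers -> one soft target distribution per node.
    return np.einsum("tn,tnc->nc", w, probs)   # (N, C)

rng = np.random.default_rng(0)
T, N, C = 3, 5, 4
logits = rng.normal(size=(T, N, C))
weights = rng.normal(size=(T, N))
targets = aggregate_teachers(logits, weights)
print(targets.shape)                           # (5, 4)
print(np.allclose(targets.sum(axis=1), 1.0))   # True: rows are distributions
```

Because both the class probabilities and the per-node teacher weights are normalized, each node's aggregated target is itself a valid probability distribution, which the student can then be trained against with a standard distillation loss.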