Self-Training Based Few-Shot Node Classification by Knowledge Distillation

Zongqian Wu, Yujie Mo, Peng Zhou, Shangbo Yuan, Xiaofeng Zhu

AAAI 2024 (2024)

Abstract
Self-training based few-shot node classification (FSNC) methods have shown excellent performance in real applications, but they cannot make full use of the information in the base set and are easily affected by the quality of pseudo-labels. To address these issues, this paper proposes a new self-training FSNC method that involves both representation distillation and pseudo-label distillation. Specifically, the representation distillation includes two knowledge distillation methods (i.e., local representation distillation and global representation distillation) to transfer the information in the base set to the novel set. The pseudo-label distillation is designed to conduct knowledge distillation on the pseudo-labels to improve their quality. Experimental results show that our method achieves superior performance compared with state-of-the-art methods. Our code and a comprehensive theoretical version are available at https://github.com/zongqianwu/KD-FSNC.
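To make the knowledge-distillation ingredient concrete, the following is a minimal, hypothetical sketch of a standard temperature-scaled distillation loss (in the style of Hinton et al.), which is the kind of objective such methods build on. This is an illustrative assumption, not the paper's actual loss; all function names here are invented for the example.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_kl(teacher_logits, student_logits, T=2.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    # This is the generic distillation objective, not the paper's exact one.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)
```

In a pseudo-label-distillation setting, the teacher's softened class distribution over unlabeled nodes would serve as the target the student matches, rather than the hard argmax pseudo-label alone.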
Keywords
ML: Semi-Supervised Learning, DMKM: Graph Mining, Social Network Analysis & Community, ML: Deep Learning Algorithms