Self-distillation and self-supervision for partial label learning

Pattern Recognition (2024)

Abstract
As a main branch of the weakly supervised learning paradigm, partial label learning (PLL) copes with the situation where each sample corresponds to a set of ambiguous candidate labels containing the unknown true label. The primary difficulty of PLL lies in label ambiguity: most existing research focuses on individual-instance knowledge while ignoring the importance of cross-sample knowledge. To circumvent this difficulty, an innovative multi-task framework is proposed in this work that integrates self-supervision and self-distillation to tackle the PLL problem. Specifically, in the self-distillation task, cross-sample knowledge within the same batch is used to refine ensembled soft targets that supervise the distillation operation, without requiring multiple networks. An auxiliary self-supervised task of recognizing rotation transformations of images provides additional supervisory signals for feature learning. Overall, training supervision is constructed not only from the input data itself but also from other instances within the same batch. Empirical results on benchmark datasets show that this method is effective in learning from partially labeled data.
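The two ingredients the abstract describes, batch-level self-distillation over candidate label sets and rotation-based self-supervision, can be illustrated with a compact sketch. The following PyTorch code is a minimal, assumption-laden illustration, not the paper's implementation: the network `SmallNet`, the batch-mean ensembling rule in `batch_distill_targets`, the mixing coefficients, and the loss weighting are all hypothetical placeholders.

```python
# Minimal sketch (assumptions noted above): one network distills from refined
# soft targets built with cross-sample (batch) knowledge, plus a rotation head.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallNet(nn.Module):
    """Shared backbone with two heads: class logits and rotation logits."""
    def __init__(self, num_classes=10, num_rotations=4):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.cls_head = nn.Linear(32, num_classes)
        self.rot_head = nn.Linear(32, num_rotations)

    def forward(self, x):
        feat = self.backbone(x)
        return self.cls_head(feat), self.rot_head(feat)

def batch_distill_targets(logits, candidate_mask, temperature=2.0):
    """Soft targets restricted to each sample's candidate set, smoothed by the
    batch-mean prediction -- a simple stand-in for cross-sample ensembling."""
    probs = F.softmax(logits / temperature, dim=1) * candidate_mask
    probs = probs / probs.sum(dim=1, keepdim=True).clamp_min(1e-8)
    batch_mean = probs.mean(dim=0, keepdim=True)        # cross-sample knowledge
    targets = (0.7 * probs + 0.3 * batch_mean) * candidate_mask  # assumed mix
    return (targets / targets.sum(dim=1, keepdim=True).clamp_min(1e-8)).detach()

def training_step(model, images, candidate_mask, optimizer):
    # Rotation self-supervision: rotate each image by k * 90 degrees, predict k.
    k = torch.randint(0, 4, (images.size(0),))
    rotated = torch.stack([torch.rot90(img, int(r), dims=(1, 2))
                           for img, r in zip(images, k)])
    cls_logits, _ = model(images)
    _, rot_logits = model(rotated)

    soft_targets = batch_distill_targets(cls_logits, candidate_mask)
    distill_loss = F.kl_div(F.log_softmax(cls_logits, dim=1),
                            soft_targets, reduction='batchmean')
    rot_loss = F.cross_entropy(rot_logits, k)
    loss = distill_loss + 0.5 * rot_loss                # assumed task weighting
    optimizer.zero_grad(); loss.backward(); optimizer.step()
    return loss.item()
```

Here `candidate_mask` is a 0/1 tensor of shape (batch, num_classes) marking each sample's candidate labels; detaching the soft targets keeps the distillation loss from degenerating, since teacher and student share one network.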
Keywords
Knowledge distillation,Self-supervised learning,Partial label learning,Machine learning