Information Selection-based Domain Adaptation from Black-box Predictors.

ICME (2023)

Abstract
Unsupervised domain adaptation aims to alleviate the scarcity of labels by transferring knowledge from a labeled source domain to an unlabeled target domain. Despite impressive progress, methods that rely on raw source data or raw source model parameters risk privacy leakage in real-world applications. In recent studies, the source model is treated as a black box whose inputs and outputs alone are accessible, and knowledge distillation is used to fit the target model. However, distillation results are degraded by confusion-prone instances and incorrect predictions from the teacher network, so we propose an Information Selection-based Knowledge Distillation (ISKD) strategy for more effective distillation. We first optimize the source model's output information at the semantic level through category associations, and then filter instance-level information using constructed confidence scores. In addition, introducing a self-distillation mechanism further improves model performance. Experiments on three benchmark datasets achieve state-of-the-art performance.
Keywords
Unsupervised domain adaptation, Knowledge Distillation, Black-box model
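As a rough illustration of the confidence-based instance filtering described in the abstract, below is a minimal PyTorch sketch of a confidence-gated distillation loss. The threshold tau, the temperature, and the function name are hypothetical choices for illustration; the paper's actual confidence construction and semantic-level optimization are not specified in this abstract.

    import torch
    import torch.nn.functional as F

    def selective_kd_loss(student_logits, teacher_probs, tau=0.9, temperature=2.0):
        """Distill only from teacher predictions deemed confident.

        teacher_probs: soft labels obtained by querying the black-box
        source model (only its outputs are accessible).
        tau: hypothetical confidence threshold for instance selection.
        """
        conf, _ = teacher_probs.max(dim=1)       # per-instance confidence score
        mask = conf >= tau                       # drop confusion-prone instances
        if not mask.any():
            return student_logits.new_zeros(())  # no confident instances in batch
        log_p = F.log_softmax(student_logits[mask] / temperature, dim=1)
        # KL divergence between the teacher's soft labels and the student's
        # (temperature-softened) predictions, scaled as in standard distillation
        return F.kl_div(log_p, teacher_probs[mask], reduction="batchmean") * temperature ** 2

    # Toy usage: in practice, teacher_probs come from black-box API queries.
    student_logits = torch.randn(8, 5)
    teacher_probs = F.softmax(torch.randn(8, 5) * 3, dim=1)
    print(selective_kd_loss(student_logits, teacher_probs))

Only the student's logits are softened here, since the black-box teacher returns fixed probabilities; this is a simplification, not the paper's formulation.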