Alleviating Style Sensitivity then Adapting: Source-free Domain Adaptation for Medical Image Segmentation

International Multimedia Conference (2022)

Abstract
Recently, source-free domain adaptation (SFDA) has attracted extensive attention in medical image segmentation due to its ability to transfer knowledge without accessing source data. However, existing SFDA methods suffer severe performance degradation because the style of the target data shifts away from that of the source. Although traditional unsupervised domain adaptation (UDA) methods can address the style-shift issue by using data from both domains, they cannot extract the source style in source-free scenarios, where source data are unavailable. In this paper, we propose a novel style-insensitive source-free domain adaptation framework (SI-SFDA) for medical image segmentation that reduces the impact of style shifts. The proposed framework first pretrains a generalized source model and then adapts it in a source-data-free manner. For the former, a cross-patch style generalization (CPSG) mechanism is introduced to reduce the style sensitivity of the source model via a self-training paradigm with a Transformer structure. For the latter, an adaptive confidence regularization (ACR) loss with a dynamic scaling strategy is developed to further reduce the classification confusion caused by style shifts. The proposed ACR loss is model-independent, so it can be combined with other methods to improve segmentation performance. Extensive experiments are conducted on five public medical image benchmarks, and the promising performance on organ and fundus segmentation tasks demonstrates the effectiveness of our framework.
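
The abstract does not give the exact form of the ACR loss. As a rough, hedged illustration only (not the paper's actual formulation), the sketch below shows a generic entropy-based confidence-regularization term for segmentation logits, the usual way such a regularizer is written; the scale argument merely stands in for where a dynamic scaling factor could enter, and all names here are hypothetical.

    import torch
    import torch.nn.functional as F

    def confidence_regularization_loss(logits, scale=1.0):
        """Generic confidence-regularization term for segmentation logits.

        logits: (B, C, H, W) raw outputs on unlabeled target images.
        scale:  placeholder for a dynamic scaling factor (hypothetical here).
        Penalizes high-entropy (low-confidence) per-pixel predictions.
        """
        probs = F.softmax(logits, dim=1)            # per-pixel class probabilities
        log_probs = F.log_softmax(logits, dim=1)
        entropy = -(probs * log_probs).sum(dim=1)   # (B, H, W) per-pixel entropy
        return scale * entropy.mean()

    # Usage sketch: add the term to the adaptation objective on target batches.
    # target_logits = model(target_images)
    # loss = self_training_loss + confidence_regularization_loss(target_logits, scale=0.1)

Because such a term depends only on the model's output probabilities, it is model-independent in the same sense the abstract claims for ACR and can be attached to other adaptation objectives.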