Self-Distilled Hierarchical Network for Unsupervised Deformable Image Registration

IEEE Transactions on Medical Imaging (2023)

Abstract
Unsupervised deformable image registration benefits from progressive network structures such as Pyramid and Cascade. However, existing progressive networks only consider the single-scale deformation field in each level or stage and ignore the long-term connection across non-adjacent levels or stages. In this paper, we present a novel unsupervised learning approach named Self-Distilled Hierarchical Network (SDHNet). By decomposing the registration procedure into several iterations, SDHNet generates hierarchical deformation fields (HDFs) simultaneously in each iteration and connects different iterations via the learned hidden state. Specifically, hierarchical features are extracted and fed into several parallel gated recurrent units to generate HDFs, which are then fused adaptively, conditioned both on themselves and on contextual features from the input images. Furthermore, unlike common unsupervised methods that apply only similarity and regularization losses, SDHNet introduces a novel self-deformation distillation scheme. This scheme distills the final deformation field as teacher guidance, imposing constraints on the intermediate deformation fields in the deformation-value and deformation-gradient spaces, respectively. Experiments on five benchmark datasets, including brain MRI and liver CT, demonstrate the superior performance of SDHNet over state-of-the-art methods, with faster inference and a smaller GPU memory footprint. Code is available at https://github.com/Blcony/SDHNet.
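The self-deformation distillation scheme admits a compact sketch. The following is a minimal, hypothetical PyTorch rendering of the idea as the abstract describes it, not the paper's implementation (see the linked repository for that): the names `spatial_gradients` and `self_distillation_loss` and the weights `w_value`/`w_grad` are assumptions, the intermediate fields are assumed to have been resampled to the final field's resolution, and the final field is detached so it serves purely as the teacher.

```python
# Hypothetical sketch of self-deformation distillation (not the official SDHNet code).
import torch
import torch.nn.functional as F


def spatial_gradients(flow):
    """Forward finite-difference gradients of a 2-D deformation field (N, 2, H, W)."""
    dy = flow[:, :, 1:, :] - flow[:, :, :-1, :]
    dx = flow[:, :, :, 1:] - flow[:, :, :, :-1]
    return dy, dx


def self_distillation_loss(intermediate_flows, final_flow, w_value=1.0, w_grad=1.0):
    """Distill the final deformation field into the intermediate ones.

    The final field acts as the teacher (gradients detached), constraining each
    intermediate field in deformation-value space (the field values themselves)
    and in deformation-gradient space (their spatial derivatives).
    """
    teacher = final_flow.detach()  # teacher guidance: no gradient flows back into it
    t_dy, t_dx = spatial_gradients(teacher)
    total = final_flow.new_zeros(())
    for flow in intermediate_flows:
        value_term = F.mse_loss(flow, teacher)                        # deformation-value space
        s_dy, s_dx = spatial_gradients(flow)
        grad_term = F.mse_loss(s_dy, t_dy) + F.mse_loss(s_dx, t_dx)   # deformation-gradient space
        total = total + w_value * value_term + w_grad * grad_term
    return total / max(len(intermediate_flows), 1)


# Usage example: three intermediate fields and one final field for a 160x192 image pair.
flows = [torch.randn(1, 2, 160, 192, requires_grad=True) for _ in range(3)]
final = torch.randn(1, 2, 160, 192, requires_grad=True)
loss = self_distillation_loss(flows, final)
loss.backward()
```

Detaching the teacher is the key design choice in this sketch: the distillation terms pull the intermediate fields toward the final one without simultaneously dragging the final field toward its intermediates, which would otherwise dilute the supervision.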
Keywords
Medical registration, unsupervised learning, neural network, self-distillation