High Throughput Deep Model of 3D Nucleus Instance Segmentation by Stereo Stitching Contextual Gaps.

ISBI (2023)

Abstract
Whole-brain three-dimensional (3D) nucleus instance segmentation is vital for quantifying regional variations in many neuroscience studies that use tissue clearing and microscopy imaging technologies. However, due to the large size of whole-brain microscopy images (µm resolution, versus mm resolution in MRI, increases the voxel count by a factor on the order of 10^9), it is computationally challenging to train an end-to-end deep model that can recognize nucleus instances in the full 3D volume. Instead, it is common practice to first segment 2D instances in each slice and then assemble them into 3D instances. Moreover, the whole-brain segmentation often comprises a collection of nucleus instance segmentation results from pre-partitioned image stacks, each with a manageable size for applying deep models. The complex arrangement of nuclei in close proximity makes stitching 2D nucleus segmentations across slices non-trivial, leading to inconsistent segmentation along inter-slice and cross-stack gaps, which undermines the nucleus instance segmentation accuracy of current state-of-the-art deep models. To address this challenge, we present a flexible learning-based stitching component that can either be integrated into existing deep models or used as a post-processing step. The backbone is a contextual graph model trained to predict the one-to-one correspondence between 2D segmentations along the gap. We have evaluated the performance of our stitching model for 3D nucleus instance segmentation from light-sheet microscopy images of mouse brains. After integrating our stitching model into existing methods, a significant improvement in instance segmentation accuracy is achieved by alleviating the inconsistency issue across discontinuous slices and stacks.
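To illustrate the stitching problem the abstract describes, the following is a minimal sketch of matching 2D instance masks on either side of an inter-slice or cross-stack gap. The paper trains a contextual graph model to predict these correspondences; as a simplified stand-in, this sketch scores candidate pairs by plain mask IoU and solves the one-to-one assignment with the Hungarian algorithm. All function names, variable names, and the min_iou threshold below are illustrative assumptions, not details taken from the paper.

# Sketch: gap stitching as one-to-one matching of 2D instances across a gap.
# The learned contextual graph model would replace the IoU affinity used here.
import numpy as np
from scipy.optimize import linear_sum_assignment


def mask_iou(a: np.ndarray, b: np.ndarray) -> float:
    """Intersection-over-union of two boolean 2D masks."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return float(inter) / union if union > 0 else 0.0


def stitch_gap(masks_top, masks_bottom, min_iou=0.3):
    """Match 2D instance masks on either side of a gap.

    masks_top / masks_bottom: lists of boolean 2D arrays (one per instance),
    e.g. from the last slice of one stack and the first slice of the next.
    Returns (i, j) index pairs whose instances should share one 3D instance ID.
    """
    if not masks_top or not masks_bottom:
        return []
    # Pairwise affinity matrix between instances across the gap.
    affinity = np.array([[mask_iou(t, b) for b in masks_bottom] for t in masks_top])
    # One-to-one assignment maximizing total affinity (negate for min-cost solver).
    rows, cols = linear_sum_assignment(-affinity)
    # Keep only sufficiently overlapping pairs; unmatched instances stay separate.
    return [(i, j) for i, j in zip(rows, cols) if affinity[i, j] >= min_iou]

Matched pairs would then be merged under a common 3D label, and the same routine could be applied at every inter-slice and cross-stack boundary before final relabeling of the volume.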
Keywords
3D Microscopy Image, Nucleus Instance Segmentation, Mask R-CNN, Graph Neural Network