On Disharmony in Batch Normalization and Dropout Methods for Early Categorization of Alzheimer's Disease

Sustainability (2022)

Abstract
Alzheimer's disease (AD) is a global health issue that predominantly affects older people. It disrupts daily activities by altering neural networks in the brain and is characterized by the death of neurons, the formation of amyloid plaques, and the development of neurofibrillary tangles. In clinical settings, an early diagnosis of AD is critical to limit its associated complications and can be made using neuroimaging modalities such as magnetic resonance imaging (MRI) and positron emission tomography (PET). Deep learning (DL) techniques are widely used in computer vision and related disciplines for tasks such as classification, segmentation, and detection. The convolutional neural network (CNN) is a DL architecture commonly used to extract and classify spatial- and frequency-domain features in image-based applications. Batch normalization and dropout are standard components of modern CNN architectures; however, owing to the internal covariate shift that arises when the two are combined, models can perform sub-optimally in diverse scenarios. This study examines the influence of this disharmony between batch normalization and dropout on the early diagnosis of AD. We considered three scenarios: (1) batch normalization with no dropout, (2) a single dropout layer placed in the network immediately before the softmax layer, and (3) a convolutional layer placed between a dropout layer and a batch normalization layer. We investigated three binary classification problems, mild cognitive impairment (MCI) vs. normal control (NC), AD vs. NC, and AD vs. MCI, and one multiclass AD vs. NC vs. MCI classification problem using the PET modality, as well as one binary AD vs. NC classification problem using the MRI modality. Our findings suggest that using little or no dropout leads to better-performing designs than using a large dropout rate.
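The three layer-ordering scenarios can be illustrated with a minimal sketch; this is not the authors' implementation, and the framework (PyTorch), channel sizes, kernel sizes, and dropout rate are illustrative assumptions only.

```python
# Minimal sketch of the three batch-normalization/dropout placements
# described in the abstract. All layer sizes and p=0.5 are assumptions.
import torch.nn as nn

def make_block(scenario: str, in_ch: int = 16, out_ch: int = 32, p: float = 0.5) -> nn.Sequential:
    """Return one convolutional block laid out per the given scenario."""
    if scenario == "bn_no_dropout":
        # (1) Batch normalization only, no dropout anywhere in the block.
        return nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(),
        )
    if scenario == "dropout_before_softmax":
        # (2) Blocks contain no dropout; a single Dropout layer is placed
        #     just before the classifier/softmax head (see `head` below).
        return nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(),
        )
    if scenario == "conv_between_dropout_and_bn":
        # (3) A convolutional layer sits between a dropout layer and a
        #     batch normalization layer.
        return nn.Sequential(
            nn.Dropout2d(p),
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(),
        )
    raise ValueError(f"unknown scenario: {scenario}")

# Classifier head for scenario (2): the lone dropout layer precedes the
# final linear layer; softmax is applied inside the loss function.
head = nn.Sequential(
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Dropout(0.5),
    nn.Linear(32, 3),  # AD / MCI / NC logits
)
```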
Keywords
neuroimaging, classification, augmentation, statistical comparison, batch normalization, dropout