Attention-Challenging Multiple Instance Learning for Whole Slide Image Classification
arXiv (2023)
Abstract
In the application of Multiple Instance Learning (MIL) methods for Whole
Slide Image (WSI) classification, attention mechanisms often focus on a subset
of discriminative instances, which are closely linked to overfitting. To
mitigate overfitting, we present Attention-Challenging MIL (ACMIL). ACMIL
combines two techniques based on separate analyses for attention value
concentration. Firstly, UMAP of instance features reveals various patterns
among discriminative instances, with existing attention mechanisms capturing
only some of them. To remedy this, we introduce Multiple Branch Attention (MBA)
to capture more discriminative instances using multiple attention branches.
Secondly, the examination of the cumulative value of Top-K attention scores
indicates that a tiny number of instances dominate the majority of attention.
In response, we present Stochastic Top-K Instance Masking (STKIM), which masks
out a portion of instances with Top-K attention values and allocates their
attention values to the remaining instances. The extensive experimental results
on three WSI datasets with two pre-trained backbones reveal that our ACMIL
outperforms state-of-the-art methods. Additionally, through heatmap
visualization and UMAP visualization, this paper extensively illustrates
ACMIL's effectiveness in suppressing attention value concentration and
overcoming the overfitting challenge. The source code is available at .
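To make the STKIM idea concrete, here is a minimal NumPy sketch: each Top-K instance is masked with some probability, and the freed attention mass is reallocated to the remaining instances by renormalization. The function name `stkim` and the renormalization-based reallocation are illustrative assumptions; the abstract does not specify the exact allocation rule.

```python
import numpy as np

def stkim(attention, k, mask_prob, seed=None):
    """Sketch of Stochastic Top-K Instance Masking (STKIM).

    `attention` is a 1-D array of per-instance attention scores summing
    to 1. Each of the Top-K instances is masked with probability
    `mask_prob`; the freed attention mass is reallocated to the
    remaining instances via renormalization (one plausible choice).
    """
    rng = np.random.default_rng(seed)
    attn = np.asarray(attention, dtype=float).copy()
    topk = np.argsort(attn)[-k:]                # indices of the K largest scores
    drop = topk[rng.random(k) < mask_prob]      # stochastically choose which to mask
    if 0 < drop.size < attn.size:               # keep at least one instance unmasked
        attn[drop] = 0.0
        attn /= attn.sum()                      # redistribute mass to the rest
    return attn

# Example: 5 instances with attention concentrated on the last two.
scores = np.array([0.05, 0.05, 0.10, 0.30, 0.50])
out = stkim(scores, k=2, mask_prob=1.0, seed=0)
# With mask_prob=1.0 both Top-2 instances are masked and the
# remaining scores are rescaled: [0.25, 0.25, 0.5, 0.0, 0.0]
```

In training, masking would be applied stochastically per step (e.g. `mask_prob < 1`), forcing the attention-based aggregator to rely on instances beyond the dominant few.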