PAM: Pyramid Attention Mechanism Based on Contextual Reasoning

IEEE Access (2019)

Abstract
Recent work has shown that self-attention modules improve the performance of convolutional neural networks (CNNs); these modules conventionally use global operations to generate descriptors from feature context for attention calculation and feature recalibration. However, the performance gain is compromised by sharing the same descriptor across different feature contexts. In this paper, we propose the Pyramid Attention Mechanism (PAM), which incorporates contextual reasoning into the self-attention module to enhance the discriminative ability of descriptors. PAM is lightweight yet efficient and can be integrated with most self-attention modules. It consists of two operators, aggregation and distribution, which assemble and synthesize contextual information at different levels. Extensive experiments on several benchmarks (CIFAR-100, ImageNet-1K, MS COCO, and VOC 2007) indicate that PAM produces competitive performance gains. In classification tasks, plugging PAM into self-attention modules yields accuracy improvements of up to 2.18% across various network structures.
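The abstract does not give PAM's exact formulation, but the two operators it names (aggregation of context at multiple pyramid levels, then distribution of the resulting descriptors back onto the feature map for recalibration) can be illustrated with a minimal NumPy sketch. This is a sketch under assumptions, not the paper's method: it assumes average-pooling aggregation, nearest-neighbour distribution, and a sigmoid gate, and the function name `pyramid_attention` is hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pyramid_attention(x, levels=(1, 2)):
    """Hypothetical sketch of pyramid-style attention recalibration.

    x: feature map of shape (C, H, W).
    Aggregation: average-pool x into an L x L grid at each pyramid
    level, so descriptors at different levels see different amounts
    of spatial context (L=1 reproduces a single global descriptor).
    Distribution: broadcast each level's descriptor back over its
    region, average across levels, and gate x with a sigmoid.
    """
    C, H, W = x.shape
    gates = np.zeros_like(x, dtype=float)
    for L in levels:
        # Aggregation: adaptive L x L average pooling.
        h_bins = np.array_split(np.arange(H), L)
        w_bins = np.array_split(np.arange(W), L)
        up = np.zeros((C, H, W))
        for i, hi in enumerate(h_bins):
            for j, wj in enumerate(w_bins):
                desc = x[:, hi[:, None], wj].mean(axis=(1, 2))  # (C,)
                # Distribution: broadcast the descriptor back
                # over the spatial region it was pooled from.
                up[:, hi[:, None], wj] = desc[:, None, None]
        gates += up
    # Synthesize levels and recalibrate the input features.
    return x * sigmoid(gates / len(levels))
```

With `levels=(1,)` this degenerates to the single global descriptor the abstract criticizes; adding finer levels gives spatially varying descriptors, which is the contextual-reasoning idea PAM targets.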
Keywords
Convolutional neural networks, feature context, self-attention