Residual Attention Network for Image Classification

2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)

Citations 4094 | Views 941

Abstract
In this work, we propose "Residual Attention Network", a convolutional neural network using an attention mechanism which can incorporate with state-of-the-art feed-forward network architectures in an end-to-end training fashion. Our Residual Attention Network is built by stacking Attention Modules which generate attention-aware features. The attention-aware features from different modules change adaptively as layers go deeper. Inside each Attention Module, a bottom-up top-down feedforward structure is used to unfold the feedforward and feedback attention process into a single feedforward process. Importantly, we propose attention residual learning to train very deep Residual Attention Networks which can be easily scaled up to hundreds of layers. Extensive analyses are conducted on the CIFAR-10 and CIFAR-100 datasets to verify the effectiveness of every module mentioned above. Our Residual Attention Network achieves state-of-the-art object recognition performance on three benchmark datasets including CIFAR-10 (3.90% error), CIFAR-100 (20.45% error) and ImageNet (4.8% single model and single crop, top-5 error). Note that our method achieves 0.6% top-1 accuracy improvement with 46% trunk depth and 69% forward FLOPs compared to ResNet-200. The experiments also demonstrate that our network is robust against noisy labels.
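The attention residual learning mentioned in the abstract combines a soft attention mask with the trunk branch features as H(x) = (1 + M(x)) * F(x), so a near-zero mask leaves the trunk features untouched instead of suppressing them. A minimal NumPy sketch of that combination rule (function and variable names here are illustrative, not from the paper's code):

```python
import numpy as np

def attention_residual(trunk_feat, mask):
    """Attention residual learning: H(x) = (1 + M(x)) * F(x).

    trunk_feat: output F(x) of the trunk branch (any array shape).
    mask: soft attention mask M(x) with values in [0, 1], same shape.
    When mask ~ 0 the module approximates an identity mapping of the
    trunk features, which is what lets very deep stacks of Attention
    Modules remain trainable.
    """
    return (1.0 + mask) * trunk_feat

# Illustrative use: a zero mask passes trunk features through unchanged,
# while a positive mask amplifies the attended features.
features = np.ones((2, 4))
identity_out = attention_residual(features, np.zeros((2, 4)))
amplified_out = attention_residual(features, np.full((2, 4), 0.5))
```

In the paper, M(x) is produced by a bottom-up top-down (encoder-decoder style) branch ending in a sigmoid, which is why its values lie in [0, 1].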
Keywords
attention residual,convolutional neural network,attention-aware features,deep residual attention networks,feed forward network architecture,image classification,CIFAR-10 datasets,CIFAR-100 datasets,ImageNet,object recognition performance