Relation-Aware Global Attention

CoRR, 2019.


Abstract:

The attention mechanism aims to increase representation power by focusing on important features and suppressing unnecessary ones. For convolutional neural networks (CNNs), attention is typically learned with local convolutions, which ignores the global information and the hidden relations. How to efficiently exploit the long-range context…
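The truncated abstract contrasts attention learned from local convolutions with attention inferred from global pairwise relations. A minimal sketch of that idea follows; it is an illustrative toy, not the paper's actual RGA module, and the feature shapes, the concatenation of row/column relations, and the mean reduction are all assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def relation_aware_attention(feats):
    """Toy relation-based global attention.

    feats: (N, C) array of feature vectors at N positions (hypothetical shape).
    Returns the features reweighted by attention derived from global relations.
    """
    # Pairwise affinities capture relations between ALL position pairs,
    # rather than only a local neighborhood as a convolution would.
    rel = feats @ feats.T                               # (N, N)
    # Each position sees its full relation vector in both directions:
    # "whom I relate to" (rows) and "who relates to me" (columns).
    global_desc = np.concatenate([rel, rel.T], axis=1)  # (N, 2N)
    # Reduce each global relation descriptor to a scalar attention score
    # (a mean here; the real module would learn this mapping).
    scores = global_desc.mean(axis=1)                   # (N,)
    attn = softmax(scores)                              # (N,)
    return attn[:, None] * feats                        # reweighted features
```

Because the affinity matrix involves every pair of positions, the attention at each position depends on the whole feature map, which is the long-range context a purely local convolution cannot see.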
