Text Information Aggregation with Centrality Attention

Jingjing Gong
Hang Yan
Yining Zheng
Other Links: arxiv.org

Abstract:

Many natural language processing problems require encoding a text sequence as a fixed-length vector, which usually involves an aggregation process that combines the representations of all the words, such as pooling or self-attention. However, these widely used aggregation approaches do not take the higher-order relationships among the words into account […]
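As background for the aggregation approaches named in the abstract, the sketch below shows the two common baselines it contrasts against: mean pooling and a simple self-attention pooling layer. This is a minimal illustration assuming PyTorch, not the paper's centrality attention method; the names mean_pool and SelfAttentionPool, and the tensor shapes used, are illustrative assumptions rather than identifiers from the authors' code.

import torch
import torch.nn as nn


def mean_pool(hidden, mask):
    # hidden: (batch, seq_len, dim) word representations
    # mask:   (batch, seq_len), 1 for real tokens, 0 for padding
    mask = mask.unsqueeze(-1).float()            # (batch, seq_len, 1)
    summed = (hidden * mask).sum(dim=1)          # (batch, dim)
    counts = mask.sum(dim=1).clamp(min=1.0)      # (batch, 1)
    return summed / counts


class SelfAttentionPool(nn.Module):
    # Scores each word independently, then returns the attention-weighted
    # sum of word representations as the sequence vector.
    def __init__(self, dim):
        super().__init__()
        self.scorer = nn.Linear(dim, 1)

    def forward(self, hidden, mask):
        scores = self.scorer(hidden).squeeze(-1)             # (batch, seq_len)
        scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)              # (batch, seq_len)
        return torch.bmm(weights.unsqueeze(1), hidden).squeeze(1)


if __name__ == "__main__":
    h = torch.randn(2, 5, 16)
    m = torch.tensor([[1, 1, 1, 0, 0], [1, 1, 1, 1, 1]])
    print(mean_pool(h, m).shape)             # torch.Size([2, 16])
    print(SelfAttentionPool(16)(h, m).shape) # torch.Size([2, 16])

Both baselines score or average each word largely in isolation, which is the limitation the abstract points to: they do not model higher-order relationships among the words.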
