Text Information Aggregation with Centrality Attention
Abstract:
Many natural language processing problems need to encode a text sequence as a fixed-length vector, which usually involves an aggregation process that combines the representations of all the words, such as pooling or self-attention. However, these widely used aggregation approaches do not take the higher-order relationships among the words into consideration.
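The abstract is only available here in truncated form, but the aggregation step it refers to can be illustrated with a minimal sketch. The names below (mean_pool, self_attention_pool, centrality_pool, H, w) are illustrative assumptions; the centrality variant is a generic eigenvector-centrality weighting computed by power iteration over a fully-connected word similarity graph, not necessarily the paper's exact formulation.

```python
# Minimal sketch of aggregating per-word vectors H (n_words x d) into one fixed-length vector.
import numpy as np

def mean_pool(H):
    """Pooling baseline: average the word representations."""
    return H.mean(axis=0)

def self_attention_pool(H, w):
    """Self-attention baseline: a scoring vector w rates each word,
    softmax turns the scores into aggregation weights."""
    scores = H @ w                              # (n_words,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ H                          # (d,)

def centrality_pool(H, n_iter=50):
    """Illustrative centrality-based weighting (assumption, not the paper's method):
    build a fully-connected affinity graph over the words and use eigenvector
    centrality, obtained by power iteration, as the aggregation weights."""
    A = np.exp(H @ H.T / np.sqrt(H.shape[1]))   # positive affinity matrix
    c = np.ones(H.shape[0]) / H.shape[0]
    for _ in range(n_iter):
        c = A @ c
        c /= c.sum()                            # renormalize each iteration
    return c @ H

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    H = rng.normal(size=(6, 8))                 # 6 words, 8-dim representations
    w = rng.normal(size=8)
    print(mean_pool(H).shape, self_attention_pool(H, w).shape, centrality_pool(H).shape)
```

In this sketch, pooling and self-attention weight each word independently of the others, whereas the centrality weights depend on the whole affinity graph, which is one way a higher-order relationship among words can enter the aggregation.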