Molding CNNs for text: non-linear, non-consecutive convolutions

In Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 1565–1575, 2015.


Abstract:

The success of deep learning often derives from well-chosen operational building blocks. In this work, we revise the temporal convolution operation in CNNs to better adapt it to text processing. Instead of concatenating word representations, we appeal to tensor algebra and use low-rank n-gram tensors to directly exploit interactions between …
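The abstract's idea of replacing concatenation-based convolution with a low-rank n-gram tensor admits a sequential dynamic program: non-consecutive trigrams are aggregated with a decay factor penalizing gaps, and the tensor is factored into three projection matrices so each position's feature is an elementwise product of projected word vectors. The following NumPy sketch illustrates that recurrence under stated assumptions; the names `ngram_tensor_features`, `P`, `Q`, `R`, and `lam`, the `tanh` activation, and the exact state-update form are illustrative choices, not the paper's reference implementation.

```python
import numpy as np

def ngram_tensor_features(X, P, Q, R, lam=0.5):
    """Sketch of non-consecutive trigram features via a low-rank tensor.

    X:   (T, d) sequence of word vectors (hypothetical input layout).
    P, Q, R: (h, d) low-rank factor matrices of the 3-way tensor.
    lam: decay in (0, 1] penalizing gaps between the selected words.

    Running states (assumed formulation):
      s1[t] = lam * s1[t-1] + P @ x_t                  # all 1-grams ending <= t
      s2[t] = lam * s2[t-1] + s1[t-1] * (Q @ x_t)      # all 2-grams ending at t
      s3[t] = lam * s3[t-1] + s2[t-1] * (R @ x_t)      # all 3-grams ending at t
    """
    T = X.shape[0]
    h = P.shape[0]
    s1 = np.zeros(h)
    s2 = np.zeros(h)
    s3 = np.zeros(h)
    out = np.zeros((T, h))
    for t in range(T):
        x = X[t]
        # Update deepest state first so each level reads the previous
        # step's value of the level below it.
        s3 = lam * s3 + s2 * (R @ x)
        s2 = lam * s2 + s1 * (Q @ x)
        s1 = lam * s1 + P @ x
        out[t] = np.tanh(s3)  # one feature vector per position
    return out
```

Because the elementwise products factor the n-gram tensor, the whole pass is O(T · h · d) rather than enumerating all O(T^3) candidate trigrams; the first two positions necessarily yield zero trigram features, since no three words precede them.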
