Deep RNNs Encode Soft Hierarchical Syntax

Meeting of the Association for Computational Linguistics, 2018.

Cited by 42 | Views 40 | DOI: https://doi.org/10.18653/v1/p18-2003
Other links: dblp.uni-trier.de | academic.microsoft.com | arxiv.org

Abstract

We present a set of experiments to demonstrate that deep recurrent neural networks (RNNs) learn internal representations that capture soft hierarchical notions of syntax from highly varied supervision. We consider four syntax tasks at different depths of the parse tree; for each word, we predict its part of speech as well as the first (pa…
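To make the task setup concrete, the following is a minimal sketch of how the per-word labels described above could be derived from a bracketed constituency parse: each word's POS tag plus its nearest ancestor constituent labels (parent, grandparent, great-grandparent). The nested-tuple tree format and the `ancestor_labels` helper are illustrative assumptions, not the paper's actual preprocessing code.

```python
# Hedged sketch: deriving a word's POS tag and its ancestor constituent
# labels (parent, grandparent, great-grandparent) from a toy parse tree.
# Tree format: a constituent is (label, child, ...); a leaf is (POS, word).

def ancestor_labels(tree, depth=3):
    """Return (word, pos, [nearest `depth` ancestor labels]) per leaf."""
    results = []

    def walk(node, ancestors):
        label, *children = node
        if len(children) == 1 and isinstance(children[0], str):
            # Leaf (POS, word): list ancestors nearest-first, pad with None.
            labels = (ancestors[::-1] + [None] * depth)[:depth]
            results.append((children[0], label, labels))
        else:
            for child in children:
                walk(child, ancestors + [label])

    walk(tree, [])
    return results

# Example parse of "the cat sleeps":
# (S (NP (DT the) (NN cat)) (VP (VBZ sleeps)))
tree = ("S", ("NP", ("DT", "the"), ("NN", "cat")), ("VP", ("VBZ", "sleeps")))
for word, pos, anc in ancestor_labels(tree):
    print(word, pos, anc)
# → the DT ['NP', 'S', None]
#   cat NN ['NP', 'S', None]
#   sleeps VBZ ['VP', 'S', None]
```

In the paper's framing, these four labels become four prediction tasks, each probed from a different layer of a multi-layer BiLSTM; the sketch only shows how the supervision targets line up with tree depth.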
