Unsupervised Recurrent Neural Network Grammars

Adhiguna Kuncoro
Gábor Melis

arXiv: Computation and Language, 2019.

Cited by 30 · Viewed 206
Indexed in EI
Other links: dblp.uni-trier.de | academic.microsoft.com | arxiv.org

Abstract

Recurrent neural network grammars (RNNGs) are generative models of language which jointly model syntax and surface structure by incrementally generating a syntax tree and sentence in a top-down, left-to-right order. Supervised RNNGs achieve strong language modeling and parsing performance, but require an annotated corpus of parse trees. […]
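To make the "top-down, left-to-right" generation order concrete, here is a minimal sketch of how a bracketed parse tree corresponds to an RNNG-style action sequence, where NT(X) opens a nonterminal constituent, GEN(w) emits a terminal word, and REDUCE closes the most recently opened constituent. The function name and tree format are illustrative, not taken from the authors' code.

```python
def tree_to_actions(tree: str) -> list[str]:
    """Convert a bracketed tree such as "(S (NP the dog) (VP barks))"
    into the top-down, left-to-right action sequence an RNNG would execute."""
    # Pad parentheses so a plain split() tokenizes the bracketing.
    tokens = tree.replace("(", " ( ").replace(")", " ) ").split()
    actions = []
    i = 0
    while i < len(tokens):
        tok = tokens[i]
        if tok == "(":
            actions.append(f"NT({tokens[i + 1]})")  # open a constituent
            i += 2
        elif tok == ")":
            actions.append("REDUCE")                # close the open constituent
            i += 1
        else:
            actions.append(f"GEN({tok})")           # generate a terminal word
            i += 1
    return actions

print(tree_to_actions("(S (NP the dog) (VP barks))"))
# → ['NT(S)', 'NT(NP)', 'GEN(the)', 'GEN(dog)', 'REDUCE',
#    'NT(VP)', 'GEN(barks)', 'REDUCE', 'REDUCE']
```

A supervised RNNG is trained on such action sequences derived from treebank annotations; the unsupervised setting studied in this paper must do without them.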
