
Improving AMR Parsing by Exploiting the Dependency Parsing As an Auxiliary Task

Multimedia Tools and Applications (2020)

Abstract
Abstract meaning representations (AMRs) represent sentence semantics as rooted, labeled, directed acyclic graphs. Although there is a strong correlation between the AMR graph of a sentence and its corresponding dependency tree, recent neural AMR parsers largely neglect dependency structure information. In this paper, we explore a novel approach to exploiting dependency structures for AMR parsing. Unlike traditional pipeline models, we treat dependency parsing as an auxiliary task for AMR parsing under a multi-task learning framework, sharing neural network parameters and selectively extracting syntactic representations with an attention mechanism. In particular, to balance the gradients and keep the focus on the AMR parsing task, we present a new dynamic weighting scheme for the loss function. Experimental results on the LDC2015E86 and LDC2017T10 datasets show that our dependency-auxiliary AMR parser significantly outperforms the baseline and its pipeline counterpart, and demonstrate that neural AMR parsers can be greatly boosted by effective methods of integrating syntax.
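The abstract describes a dynamic weighting scheme that balances the gradients of the main AMR parsing loss and the auxiliary dependency parsing loss, but it does not give the formula. The Python sketch below illustrates one plausible way such a scheme could behave; the function name combined_loss, its parameters, and the ratio-based weighting rule are illustrative assumptions, not the paper's actual method.

# Hypothetical sketch of a dynamically weighted multi-task loss.
# The auxiliary (dependency) loss weight is scaled by the ratio of the
# main (AMR) loss to the auxiliary loss, so gradients from the
# auxiliary task never dominate the AMR parsing objective.

def combined_loss(amr_loss: float, dep_loss: float, base_weight: float = 0.5) -> float:
    """Return the total training loss for one batch.

    amr_loss    -- loss of the main AMR parsing task
    dep_loss    -- loss of the auxiliary dependency parsing task
    base_weight -- static upper bound on the auxiliary weight
    """
    eps = 1e-8
    # Dynamic weight: shrink the auxiliary contribution when the AMR
    # loss is already small relative to the dependency loss.
    dynamic_weight = base_weight * min(1.0, amr_loss / (dep_loss + eps))
    return amr_loss + dynamic_weight * dep_loss


# Early in training both losses are large, so the auxiliary task
# contributes close to its base weight; later the AMR loss drops and
# the auxiliary term is down-weighted automatically.
print(combined_loss(amr_loss=4.0, dep_loss=5.0))  # 4.0 + 0.4 * 5.0 = 6.0
print(combined_loss(amr_loss=0.5, dep_loss=2.0))  # 0.5 + 0.125 * 2.0 = 0.75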
Keywords
Abstract meaning representations, Multi-task learning, Dependency-auxiliary AMR parser, Neural network