AMR Parsing as Sequence-to-Graph Transduction
Annual Meeting of the Association for Computational Linguistics (ACL), 2019.
Abstract:
We propose an attention-based model that treats AMR parsing as sequence-to-graph transduction. Unlike most AMR parsers that rely on pre-trained aligners, external semantic resources, or data augmentation, our proposed parser is aligner-free, and it can be effectively trained with limited amounts of labeled AMR data. Our experimental results outperform all previously reported SMATCH scores, on both AMR 2.0 (76.3% F1 on LDC2017T10) and AMR 1.0 (70.2% F1 on LDC2014T12).
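To make "sequence-to-graph transduction" concrete, the sketch below shows one generic way such a decoder can be organized: nodes are emitted one at a time from attention over the encoded sentence, and each new node is attached to a previously generated node via a pointer-style head selection plus a relation label. This is only a minimal illustration under assumed names and scoring functions (`decode_graph`, `W_node`, `W_rel`, dot-product attention, random "parameters"); it is not the paper's actual architecture or released implementation.

```python
# Minimal sketch of sequence-to-graph decoding (illustrative only, not the
# authors' model): greedily predict a node, then point to an earlier node as
# its head and label the edge. All names and parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys):
    """Dot-product attention: weighted sum of encoder states."""
    weights = softmax(keys @ query)
    return weights @ keys

def decode_graph(encoder_states, node_vocab, rel_vocab, max_nodes=5):
    """Emit (nodes, edges) where edges are (head_index, dep_index, relation)."""
    d = encoder_states.shape[1]
    # Hypothetical "trained" parameters, stand-ins for a real model.
    W_node = rng.normal(size=(d, len(node_vocab)))
    W_rel = rng.normal(size=(d, len(rel_vocab)))
    nodes, edges, node_states = [], [], []
    state = encoder_states.mean(axis=0)            # crude initial decoder state
    for step in range(max_nodes):
        context = attend(state, encoder_states)    # attend over the sentence
        h = np.tanh(state + context)               # decoder hidden state
        nodes.append(node_vocab[int(np.argmax(h @ W_node))])  # node label
        node_states.append(h)
        if step > 0:
            # Pointer-style head selection over previously generated nodes.
            head = int(np.argmax(np.stack(node_states[:-1]) @ h))
            rel = rel_vocab[int(np.argmax(h @ W_rel))]
            edges.append((head, step, rel))
        state = h
    return nodes, edges

if __name__ == "__main__":
    enc = rng.normal(size=(7, 16))                 # fake encoding of a 7-token sentence
    print(decode_graph(enc, ["want-01", "boy", "go-02"], [":ARG0", ":ARG1"]))
```

Because every decision (node label, head, relation) is made with attention over the input and the partially built graph, no pre-computed token-to-node alignments are required, which is the sense in which such a parser can be "aligner-free".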