DABP: A Domain Augmentation and Bidirectional Stack-Propagation Model for Task-Oriented NLU

Communications in Computer and Information Science (2023)

Abstract
Natural language understanding (NLU) is a key component of task-oriented dialogue systems. Most existing task-oriented NLU models use pre-trained models (PTMs) for semantic encoding, but these PTMs often perform poorly on specific task-oriented dialogue data because of the small data volume and the lack of domain-specific knowledge. In addition, most joint models of slot filling and intention detection either use only a joint loss function or provide only a one-way semantic connection, and thus fail to achieve deep information interaction between the two tasks. In this paper, we propose a Domain Augmentation and Bidirectional Stack-Propagation (DABP) model for NLU. In the proposed model, we use the masked language model (MLM) task and a proposed part-of-speech tagging task to enhance PTMs with both implicit and explicit domain-specific knowledge. In addition, we propose a bidirectional stack-propagation mechanism to propagate information between the two tasks. Experimental results show that the proposed model achieves better performance than state-of-the-art models on the ATIS and SNIPS datasets.
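To make the bidirectional stack-propagation idea concrete, the following is a minimal PyTorch sketch of a joint head in which the first-pass intent logits are fed into slot filling and the first-pass slot logits are fed back into intention detection. The class name, layer shapes, pooling choices, and label counts are illustrative assumptions, not the paper's exact architecture; the encoder producing the token states is assumed to be any pre-trained model.

```python
import torch
import torch.nn as nn


class BidirectionalStackPropagationHead(nn.Module):
    """Hypothetical joint head: intent logits augment slot filling,
    pooled slot logits augment intention detection (a sketch, not DABP itself)."""

    def __init__(self, hidden_size: int, num_intents: int, num_slots: int):
        super().__init__()
        # First-pass predictions from the shared encoder representation.
        self.intent_head_1 = nn.Linear(hidden_size, num_intents)
        self.slot_head_1 = nn.Linear(hidden_size, num_slots)
        # Second-pass heads consume the other task's first-pass logits.
        self.slot_head_2 = nn.Linear(hidden_size + num_intents, num_slots)
        self.intent_head_2 = nn.Linear(hidden_size + num_slots, num_intents)

    def forward(self, token_states: torch.Tensor):
        # token_states: (batch, seq_len, hidden) from a pre-trained encoder.
        sent_state = token_states.mean(dim=1)             # simple sentence pooling
        intent_logits_1 = self.intent_head_1(sent_state)  # (B, num_intents)
        slot_logits_1 = self.slot_head_1(token_states)    # (B, T, num_slots)

        # Intent -> slot direction: broadcast intent logits to every token.
        intent_ctx = intent_logits_1.unsqueeze(1).expand(-1, token_states.size(1), -1)
        slot_logits = self.slot_head_2(torch.cat([token_states, intent_ctx], dim=-1))

        # Slot -> intent direction: pool token-level slot logits into a sentence vector.
        slot_ctx = slot_logits_1.mean(dim=1)
        intent_logits = self.intent_head_2(torch.cat([sent_state, slot_ctx], dim=-1))
        return intent_logits, slot_logits


# Usage with random encoder outputs standing in for a PTM such as BERT.
head = BidirectionalStackPropagationHead(hidden_size=768, num_intents=21, num_slots=120)
intent_logits, slot_logits = head(torch.randn(2, 16, 768))
print(intent_logits.shape, slot_logits.shape)  # (2, 21) and (2, 16, 120)
```

In this sketch both directions operate on first-pass logits so that neither task's second-pass prediction depends on the other's final output, which keeps the forward pass acyclic; the actual interaction mechanism in DABP may differ.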
Keywords
domain augmentation, stack-propagation, task-oriented