
Incorporating BERT with Probability-Aware Gate for Spoken Language Understanding

IEEE/ACM Transactions on Audio, Speech, and Language Processing (2023)

Cited by 6
Abstract
Spoken language understanding (SLU) is an essential component of task-oriented dialogue systems, comprising mainly intent detection and slot filling. Some existing approaches obtain enhanced semantic representations by modeling the correlation between the two tasks. However, these methods show little improvement when applied to BERT, since BERT has already learned rich semantic features. In this paper, we propose a BERT-based model with a probability-aware gate mechanism, called PAGM (Probability-Aware Gated Model). PAGM learns the correlation between intent and slots from the perspective of probability distributions, explicitly using intent information to guide slot filling. In addition, to incorporate BERT efficiently with the probability-aware gate, we design a stacked fine-tuning strategy, which introduces an intermediate stage before target-model training so that BERT obtains a better initialization for final training. Experiments show that PAGM achieves significant improvements on two benchmark datasets and outperforms previous state-of-the-art results.
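The abstract does not give the gate's exact equations, but the core idea of guiding slot filling with the intent probability distribution can be sketched. The following minimal, dependency-free example is an illustration of one plausible scheme, not the paper's actual architecture: the softmax over utterance-level intent logits is projected into slot-label space by a hypothetical linear map `W` and added as a bias to each token's slot logits.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def probability_aware_gate(intent_logits, slot_logits, W):
    """Gate per-token slot logits with the intent probability distribution.

    intent_logits: utterance-level logits, one per intent class.
    slot_logits:   list of per-token logit lists, one per slot label.
    W:             hypothetical intent-to-slot projection matrix
                   (rows = intents, cols = slot labels); illustrative only.
    """
    p_intent = softmax(intent_logits)  # explicit probability distribution
    gated = []
    for token_logits in slot_logits:
        # Project the intent distribution into slot space and add it as a bias,
        # so likely intents boost the slot labels they are associated with.
        bias = [sum(p * W[i][j] for i, p in enumerate(p_intent))
                for j in range(len(token_logits))]
        gated.append([l + b for l, b in zip(token_logits, bias)])
    return gated
```

With an identity-like `W`, a confidently predicted intent simply raises the logits of its associated slot labels; a learned `W` would let the model discover these intent-slot associations during fine-tuning.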
Keywords
Training,Correlation,Bit error rate,Semantics,Natural languages,Logic gates,Filling,Natural language processing,spoken language understanding,intent detection,slot filling