Zero-Shot Intent Classification Using a Semantic Similarity Aware Contrastive Loss and Large Language Model

ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2024

Abstract
Zero-shot systems can reduce the cost of data collection and training in a new domain, since they work directly on test data without further training. In this paper, we build zero-shot systems for intent classification based on a Semantic Similarity-aware Contrastive Loss (SSCL), which addresses an issue in the original contrastive loss (CL): CL treats all non-corresponding pairs indiscriminately. We confirm experimentally that SSCL outperforms CL. We then explore how including in-domain text or speech data during SSCL training affects out-of-domain intent classification. During zero-shot classification, embeddings are generated for the set of classes in the new domain, the similarity between each class embedding and the input utterance embedding is computed, and the most similar class is predicted as the utterance's intent. Although manually collected text sentences per class can be used to generate the class embeddings, such data collection can be costly. We therefore explore how to generate better class embeddings without human-collected text data in the target domain. The best proposed method, which employs an instruction-tuned Llama2 (a public large language model), performs comparably to the case where human-collected text data is used, underscoring the importance of accurate class-embedding generation.
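The zero-shot prediction step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the embedding dimensionality, class names, and the use of cosine similarity are assumptions for the sake of the example, and the toy vectors stand in for embeddings that would come from the trained SSCL encoder.

```python
import numpy as np

def zero_shot_intent(utterance_emb, class_embs, class_names):
    """Predict the intent whose class embedding is most similar
    (by cosine similarity, assumed here) to the utterance embedding."""
    u = utterance_emb / np.linalg.norm(utterance_emb)
    C = class_embs / np.linalg.norm(class_embs, axis=1, keepdims=True)
    sims = C @ u  # cosine similarity of each class to the utterance
    return class_names[int(np.argmax(sims))]

# Toy example: hypothetical 3-dimensional embeddings for three intents.
class_names = ["play_music", "set_alarm", "get_weather"]
class_embs = np.array([[1.0, 0.1, 0.0],
                       [0.0, 1.0, 0.1],
                       [0.1, 0.0, 1.0]])
utterance_emb = np.array([0.9, 0.2, 0.1])  # closest to "play_music"
print(zero_shot_intent(utterance_emb, class_embs, class_names))  # -> play_music
```

In the paper's setting, the class embeddings would be built either from manually collected example sentences per class or from sentences generated by an instruction-tuned Llama2; the prediction rule itself is unchanged.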