q2d: Turning Questions into Dialogs to Teach Models How to Search.

Yonatan Bitton, Shlomi Cohen-Ganor, Ido Hakimi, Yoad Lewenberg, Roee Aharoni, Enav Weinreb

CoRR (2023)

Abstract
One of the exciting capabilities of recent language models for dialog is their ability to independently search for relevant information to ground a given dialog response. However, obtaining training data to teach models how to issue search queries is time- and resource-consuming. In this work, we propose q2d: an automatic data generation pipeline that generates information-seeking dialogs from questions. We prompt a large language model (PaLM) to create conversational versions of question answering datasets, and use them to improve query generation models that communicate with external search APIs to ground dialog responses. Unlike previous approaches, which relied on human-written dialogs with search queries, our method makes it possible to automatically generate query-based grounded dialogs with better control and scale. Our experiments demonstrate that: (1) for query generation on the QReCC dataset, models trained on our synthetically generated data achieve 90%--97% of the performance of models trained on human-generated data; (2) we can successfully generate data for training dialog models in new domains without any existing dialog data, as demonstrated on the multi-hop MuSiQue and Bamboogle QA datasets; and (3) a thorough analysis of the generated dialogs shows that humans find them of high quality and struggle to distinguish them from human-written dialogs.
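The pipeline described in the abstract can be sketched in a few lines; the prompt wording and the `call_llm` stub below are illustrative assumptions, not the paper's actual prompt or model (the paper uses PaLM for this step):

```python
# Minimal sketch of the q2d idea: turn a QA pair into an
# information-seeking dialog whose final turn requires a search query,
# yielding a synthetic (dialog -> query) training example.

def build_prompt(question: str, answer: str) -> str:
    """Format an instruction asking an LLM to rewrite a QA pair as a
    multi-turn dialog (wording here is a hypothetical placeholder)."""
    return (
        "Rewrite the following question and answer as a natural "
        "information-seeking dialog between a user and an assistant.\n"
        f"Question: {question}\nAnswer: {answer}\nDialog:"
    )

def call_llm(prompt: str) -> str:
    """Stub standing in for the large language model call; returns a
    fixed toy dialog so the sketch runs standalone."""
    return ("User: I've been reading about dialog systems.\n"
            "Assistant: Interesting! What would you like to know?\n"
            "User: When a model needs outside facts, how does it get them?")

def q2d(question: str, answer: str) -> dict:
    """Produce one synthetic training example: the generated dialog as
    context, with the original question serving as the target query."""
    dialog = call_llm(build_prompt(question, answer))
    return {"dialog": dialog, "search_query": question, "answer": answer}

example = q2d("How do dialog models ground their responses?",
              "By issuing search queries to external search APIs.")
print(example["search_query"])
```

A query generation model would then be trained to map `example["dialog"]` to `example["search_query"]`, replacing the human-written dialogs used in earlier work.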
Keywords
dialogs,models,search,questions