
Context-Aware Language Modeling for Goal-Oriented Dialogue Systems

arXiv (Cornell University), 2022

Abstract
Goal-oriented dialogue systems face a tradeoff between fluent language generation and task-specific control. While supervised learning with large language models is capable of producing realistic text, how to steer such responses towards completing a specific task without sacrificing language quality remains an open question. In this work, we formulate goal-oriented dialogue as a partially observed Markov decision process, interpreting the language model as a representation of both the dynamics and the policy. This view allows us to extend techniques from learning-based control, such as task relabeling, to derive a simple and effective method to finetune language models in a goal-aware way, leading to significantly improved task performance. We additionally introduce a number of training strategies that serve to better focus the model on the task at hand. We evaluate our method, Context-Aware Language Models (CALM), on a practical flight-booking task using AirDialogue. Empirically, CALM outperforms the state-of-the-art method by 7% in terms of task success, matching human-level task performance.
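The abstract describes relabeling dialogues with the goal they actually accomplished and then fine-tuning the language model conditioned on that goal. The sketch below illustrates that general idea only; the relabel helper, the data fields, and the goal-prompt format are assumptions for illustration, not the paper's actual AirDialogue training pipeline.

```python
# Minimal sketch of goal-conditioned fine-tuning with hindsight task
# relabeling, in the spirit of the abstract. Data fields ("achieved_outcome",
# "turns") and the prompt layout are assumptions, not CALM's real format.
import torch
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

def relabel(dialogue):
    """Hypothetical relabeling step: replace the original goal with the
    outcome the dialogue actually reached (e.g. the flight that was booked),
    so every trajectory becomes a successful example for *some* goal."""
    return {"goal": dialogue["achieved_outcome"], "turns": dialogue["turns"]}

def to_training_text(example):
    # Condition the language model on the (relabeled) goal, then the turns.
    return "goal: " + example["goal"] + " dialogue: " + " ".join(example["turns"])

def finetune_step(dialogue):
    example = relabel(dialogue)
    enc = tokenizer(to_training_text(example), return_tensors="pt",
                    truncation=True, max_length=512)
    # Standard causal language-modeling loss over the goal-conditioned sequence.
    out = model(input_ids=enc["input_ids"],
                attention_mask=enc["attention_mask"],
                labels=enc["input_ids"])
    out.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return out.loss.item()
```

In this view, fine-tuning remains ordinary supervised learning; the goal-awareness comes entirely from prepending the relabeled goal to the context before computing the language-modeling loss.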
Key words
Context-Aware Applications, Spoken Dialogue Systems, Dialog Management, Topic Modeling