Bio
I'm interested in building human-like language generation systems. Natural language generation (NLG) is a key component of many language technology applications such as dialogue systems, question answering systems, and story generation. However, current systems still fall far behind human-like or human-level generation. This is because much of the information that shapes an utterance is implicit and not obvious on the surface. For instance, many different sentences can convey the same meaning while still differing subtly in their surface form. We call these parameters facets: factors reflected in linguistic variation, such as external knowledge, intents, interpersonal information, speaker-internal information, and more. To generate human-like utterances, these facets must be modeled appropriately, and the generation system needs to be effectively guided by them. Motivated by Halliday's Systemic Functional Linguistics (SFL) theory (1978), my thesis focuses on three facet groups: knowledge, structure, and style, and presents effective computational methods for handling each facet across a wide range of generation tasks:
Neural-symbolic integration for trustworthy and factual generation
Text planning for coherently structured generation
Cross-style language understanding for stylistically appropriate generation
Research Interests:
Computational Linguistics, Natural Language Processing, Machine Learning, Human-Computer Interaction
Natural Language Generation (NLG), Dialogue, Summarization, Distributional Semantics