Unsupervised Story Comprehension With Hierarchical Encoder-Decoder

PROCEEDINGS OF THE 2019 ACM SIGIR INTERNATIONAL CONFERENCE ON THEORY OF INFORMATION RETRIEVAL (ICTIR '19), 2019

Abstract
Commonsense understanding is a long-standing, still unresolved goal of natural language processing. One standard testbed for commonsense understanding is the Story Cloze Test (SCT) [22]: given a four-sentence story, a system must select the proper ending from two proposed candidates. Because the SCT training set contains only unlabeled stories, previous works usually train on the small labeled development set, which ignores the abundant unlabeled training data and, essentially, does not reveal the commonsense reasoning procedure. In this paper, we propose an unsupervised sequence-to-sequence method for story reading comprehension: we use only the unlabeled stories and directly model the context-target inference probability. We further propose a loss-reweighting training strategy for the sequence-to-sequence model to dynamically tune the training process. Experimental results demonstrate the advantage of the proposed model, which achieves results comparable to supervised methods on SCT.
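The abstract describes scoring candidate endings with a sequence-to-sequence model trained only on unlabeled stories, plus a loss-reweighting scheme during training. The sketch below is not the authors' code; it is a minimal PyTorch illustration of that general idea, in which a candidate ending is scored by its conditional log-likelihood given the four-sentence context, and per-example losses are reweighted by an assumed rule (all module names, sizes, and the reweighting formula are illustrative assumptions, not the paper's).

```python
# Minimal sketch (assumptions, not the paper's implementation): score a candidate
# ending by P(ending | 4-sentence context) under a seq2seq model, and reweight
# per-example losses during unsupervised training on unlabeled stories.
import torch
import torch.nn as nn


class Seq2SeqScorer(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=256, pad_id=0):
        super().__init__()
        self.pad_id = pad_id
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=pad_id)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def ending_logprob(self, context, ending):
        # context, ending: LongTensor [batch, seq_len] of token ids
        _, h = self.encoder(self.embed(context))      # encode the 4-sentence context
        dec_in, targets = ending[:, :-1], ending[:, 1:]  # teacher forcing
        dec_out, _ = self.decoder(self.embed(dec_in), h)
        logp = torch.log_softmax(self.out(dec_out), dim=-1)
        tok_logp = logp.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
        mask = (targets != self.pad_id).float()
        return (tok_logp * mask).sum(dim=1)           # sum log P(ending | context)


def reweighted_loss(model, context, ending):
    # One illustrative reweighting rule (assumption): emphasize examples the
    # model currently fits poorly, so training focus shifts dynamically.
    nll = -model.ending_logprob(context, ending)      # per-example NLL
    with torch.no_grad():
        weights = torch.softmax(nll, dim=0) * nll.size(0)
    return (weights * nll).mean()


def choose_ending(model, context, cand1, cand2):
    # SCT inference: pick the candidate with the higher conditional likelihood.
    s1 = model.ending_logprob(context, cand1)
    s2 = model.ending_logprob(context, cand2)
    return (s2 > s1).long()                           # 0 -> first ending, 1 -> second
```

Under these assumptions, scoring by conditional likelihood is what lets the model be trained purely on unlabeled stories (each story's true fifth sentence serves as the target), matching the unsupervised setting the abstract describes.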
Keywords
Machine Comprehension, Unsupervised Learning