Temporal Relationship Extraction For Natural Language Texts By Using Deep Bidirectional Language Model

2020 IEEE International Conference on Big Data and Smart Computing (BigComp 2020)

Abstract
In general, documents contain temporal information, and recognizing that information is crucial for understanding the overall content of documents written in natural language. Extracting temporal information involves three tasks: capturing the time expressions themselves, identifying the events associated with those time expressions, and extracting the temporal relations between times or events. Because of the inherent linguistic characteristics of different languages, it is hard to capture all time information in a given sentence without considering the context of temporal relationships. In this paper, we design an artificial neural network model that extracts temporal relations, one of the tasks involved in extracting temporal information from natural language sentences. Our proposed model is based on a deep bidirectional architecture that learns temporal relationships from given sentences. The model separates an input sentence into individual word tokens, converts them into embedding vectors, and then learns whether each token is the subject or the object of the temporal relationship information in the given sentence. Before targeting models and datasets for multiple languages, we first conduct our research on English and Korean.
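The paper itself provides no code; as a rough illustration of the token-level setup the abstract describes (tokenize a sentence, embed each token, run a deep bidirectional encoder, and classify each token as the subject or object of a temporal relation), a minimal sketch might look like the following. All class names, label sets, and hyperparameters here are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only: a deep bidirectional encoder that tags each word token
# of a sentence as subject, object, or neither of a temporal relation.
import torch
import torch.nn as nn

class TemporalRelationTagger(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, num_labels=3):
        super().__init__()
        # Embedding vectors for the separated word tokens of the input sentence.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Deep bidirectional encoder so each token sees left and right context.
        self.encoder = nn.LSTM(embed_dim, hidden_dim, num_layers=2,
                               bidirectional=True, batch_first=True)
        # Per-token classifier: subject / object / neither (hypothetical label set).
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)   # (batch, seq_len, embed_dim)
        encoded, _ = self.encoder(embedded)    # (batch, seq_len, 2 * hidden_dim)
        return self.classifier(encoded)        # (batch, seq_len, num_labels)

# Usage with hypothetical token ids for one five-token sentence.
model = TemporalRelationTagger(vocab_size=30000)
sentence = torch.tensor([[12, 845, 33, 2901, 7]])
logits = model(sentence)
print(logits.shape)  # torch.Size([1, 5, 3]) -- one label score vector per token
```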
Keywords
temporal information extraction, temporal relationships, temporal context, time information, time expressions