Abstract

Event temporal relation extraction is an important natural language understanding task that is widely used in downstream applications such as knowledge graph construction, question answering, and narrative generation. Existing methods often treat the task as a sentence-level event-pair classification problem and solve it with a classification model. However, relying only on limited local sentence information leads to low extraction accuracy and cannot guarantee the global consistency of the predicted temporal relations. To address this problem, this paper proposes a document-level event temporal relation extraction method with context information. It uses a neural network model based on Bi-LSTM (bidirectional long short-term memory) to learn temporal relation representations of event pairs, and then applies a self-attention mechanism to incorporate information from other event pairs in the context, obtaining better event-pair representations for temporal relation classification. Finally, these context-aware representations improve global event temporal relation extraction by enhancing the temporal relation classification of all event pairs in the document. Experiments on the TB-Dense (TimeBank Dense) dataset and the MATRES (Multi-Axis Temporal RElations for Start-points) dataset show that this method achieves better results than the latest sentence-level methods.
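To make the described pipeline concrete, the following is a minimal PyTorch sketch of the kind of architecture outlined above: a Bi-LSTM encoder over the document, event-pair representations built from the event token positions, and self-attention over all event pairs of the same document before classification. The module names, dimensions, and number of relation labels (e.g., six for TB-Dense) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a document-level event-pair classifier with
# Bi-LSTM encoding and self-attention over event pairs (illustrative only).
import torch
import torch.nn as nn

class DocEventPairClassifier(nn.Module):
    def __init__(self, emb_dim=300, hidden_dim=128, num_relations=6):
        super().__init__()
        # Bi-LSTM encodes the token embeddings of the whole document.
        self.encoder = nn.LSTM(emb_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        # Self-attention lets each event pair attend to the other event
        # pairs in the document (the "context information").
        self.self_attn = nn.MultiheadAttention(embed_dim=4 * hidden_dim,
                                               num_heads=4, batch_first=True)
        self.classifier = nn.Linear(4 * hidden_dim, num_relations)

    def forward(self, token_embs, pair_indices):
        # token_embs:   (1, doc_len, emb_dim)  -- one document
        # pair_indices: (num_pairs, 2)         -- token positions of the two events
        hidden, _ = self.encoder(token_embs)          # (1, doc_len, 2*hidden_dim)
        e1 = hidden[0, pair_indices[:, 0]]            # (num_pairs, 2*hidden_dim)
        e2 = hidden[0, pair_indices[:, 1]]
        pair_reprs = torch.cat([e1, e2], dim=-1)      # (num_pairs, 4*hidden_dim)
        # Combine each pair representation with those of the other pairs.
        ctx, _ = self.self_attn(pair_reprs.unsqueeze(0),
                                pair_reprs.unsqueeze(0),
                                pair_reprs.unsqueeze(0))
        return self.classifier(ctx.squeeze(0))        # (num_pairs, num_relations)

# Example usage with random inputs (shapes only; not real data).
if __name__ == "__main__":
    model = DocEventPairClassifier()
    doc = torch.randn(1, 50, 300)                      # 50-token document
    pairs = torch.tensor([[3, 17], [17, 42], [3, 42]]) # three event pairs
    print(model(doc, pairs).shape)                     # torch.Size([3, 6])
```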