Enhanced Temporal Knowledge Embeddings with Contextualized Language Representations
by Zhen Han, Ruotong Liao, Beiyan Liu, Yao Zhang, Zifeng Ding, Heinz Köppl, Hinrich Schütze, Volker Tresp (2022)
Abstract
With the emerging research effort to integrate structured and unstructured
knowledge, many approaches incorporate factual knowledge into pre-trained
language models (PLMs) and apply the knowledge-enhanced PLMs on downstream NLP
tasks. However, (1) these approaches consider only static factual knowledge,
whereas knowledge graphs (KGs) also contain temporal facts, i.e., events that
indicate evolving relationships among entities at different timestamps, and
(2) PLMs cannot be applied directly to many KG tasks, such as temporal KG
completion.
In this paper, we focus on enhancing temporal knowledge embeddings
with contextualized language representations (ECOLA). We
align structured knowledge contained in temporal knowledge graphs with their
textual descriptions extracted from news articles and propose a novel
knowledge-text prediction task to inject the abundant information from
descriptions into temporal knowledge embeddings. ECOLA jointly optimizes the
knowledge-text prediction objective and the temporal knowledge embeddings, so
that the model can simultaneously take full advantage of textual and
structured knowledge information. For training ECOLA, we introduce three
temporal KG datasets with
aligned textual descriptions. Experimental results on the temporal knowledge
graph completion task show that ECOLA outperforms state-of-the-art temporal KG
models by a large margin. The proposed datasets can serve as new temporal KG
benchmarks and facilitate future research on structured and unstructured
knowledge integration.
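
To make the joint training objective concrete, below is a minimal PyTorch sketch of a loss of the form L = L_ktp + lambda * L_tkg. It assumes a BERT-style masked-token loss for the knowledge-text prediction task and an entity-classification loss for temporal KG completion; the class name EcolaStyleJointLoss, the lambda weighting, and all tensor shapes are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class EcolaStyleJointLoss(nn.Module):
    # Hypothetical joint objective L = L_ktp + lam * L_tkg. In an
    # ECOLA-style model both losses back-propagate into the same
    # temporal knowledge embeddings, which is how textual information
    # is injected into the structured representations.
    def __init__(self, lam: float = 1.0):
        super().__init__()
        self.lam = lam
        self.ce = nn.CrossEntropyLoss()

    def forward(self, ktp_logits, ktp_labels, tkg_scores, tkg_labels):
        # Knowledge-text prediction: recover masked tokens of the
        # concatenated (temporal fact, textual description) sequence.
        l_ktp = self.ce(ktp_logits, ktp_labels)
        # Temporal KG completion: score the missing entity of a
        # (subject, predicate, ?, timestamp) query against all entities.
        l_tkg = self.ce(tkg_scores, tkg_labels)
        return l_ktp + self.lam * l_tkg

# Toy usage: 8 masked positions over a 30k-token vocabulary and
# 4 completion queries over 500 candidate entities.
ktp_logits = torch.randn(8, 30000, requires_grad=True)
ktp_labels = torch.randint(0, 30000, (8,))
tkg_scores = torch.randn(4, 500, requires_grad=True)
tkg_labels = torch.randint(0, 500, (4,))
loss = EcolaStyleJointLoss(lam=0.5)(ktp_logits, ktp_labels, tkg_scores, tkg_labels)
loss.backward()

Because the two objectives share the underlying embedding tables, gradients from the text-side prediction task and the KG-side completion task update the same parameters, matching the abstract's claim that ECOLA takes advantage of both information sources simultaneously.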
Archived Files and Locations
application/pdf, 1.1 MB: arxiv.org (repository), web.archive.org (webarchive)
arXiv:2203.09590v1