Variational Knowledge Graph Reasoning
by
Wenhu Chen, Wenhan Xiong, Xifeng Yan, William Wang
2018
Abstract
Inferring missing links in knowledge graphs (KG) has attracted a lot of
attention from the research community. In this paper, we tackle a practical
query answering task that involves predicting the relation of a given entity pair.
We frame this prediction problem as an inference problem in a probabilistic
graphical model and aim at resolving it from a variational inference
perspective. In order to model the relation between the query entity pair, we
assume that there exists an underlying latent variable (paths connecting the two
nodes) in the KG, which carries the equivalent semantics of their relation.
However, due to the intractability of enumerating connections in large KGs, we
propose to use variational inference to maximize the evidence lower bound. More
specifically, our framework (Diva) is composed of three modules, i.e.,
a posterior approximator, a prior (path finder), and a likelihood (path
reasoner). By using variational inference, we are able to incorporate them
closely into a unified architecture and jointly optimize them to perform KG
reasoning. With active interactions among these sub-modules, Diva is
better at handling noise and coping with more complex reasoning scenarios. To
evaluate our method, we conduct experiments on the link prediction
task on two benchmark datasets and achieve state-of-the-art performance on
both.
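The evidence lower bound referred to in the abstract can be sketched as follows. Treating the set of connecting paths $L$ between a source entity $e_s$ and a destination entity $e_d$ as the latent variable, the intractable log-likelihood of a relation $r$ is lower-bounded in the standard variational form (the subscripts $\varphi$, $\theta$, $\beta$ for the approximate posterior, likelihood, and prior are notational assumptions matching the three modules named above):

```latex
\log p\bigl(r \mid (e_s, e_d)\bigr)
\;\geq\;
\mathbb{E}_{q_{\varphi}\left(L \mid r,\, (e_s, e_d)\right)}
\Bigl[\log p_{\theta}\bigl(r \mid L\bigr)\Bigr]
\;-\;
D_{\mathrm{KL}}\!\Bigl(
  q_{\varphi}\bigl(L \mid r,\, (e_s, e_d)\bigr)
  \,\Big\|\,
  p_{\beta}\bigl(L \mid (e_s, e_d)\bigr)
\Bigr)
```

The first term is maximized by the path reasoner (likelihood), while the KL term keeps the posterior approximator close to the path finder (prior), which is what allows the three modules to be trained jointly.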
Archived Files and Locations
application/pdf, 509.8 kB
arxiv.org (repository), web.archive.org (webarchive)
arXiv: 1803.06581v2