Order Matters: Probabilistic Modeling of Node Sequence for Graph Generation
by
Xiaohui Chen, Xu Han, Jiajing Hu, Francisco J. R. Ruiz, Liping Liu
2021
Abstract
A graph generative model defines a distribution over graphs. One class of
generative models is built from autoregressive neural networks, which generate
a graph by sequentially adding nodes and edges. However, the likelihood
of a graph under the autoregressive model is intractable, as there are numerous
sequences leading to the given graph; this makes maximum likelihood estimation
challenging. Instead, in this work we derive the exact joint probability over
the graph and the node ordering of the sequential process. From the joint, we
approximately marginalize out the node orderings and compute a lower bound on
the log-likelihood using variational inference. We train graph generative
models by maximizing this bound, without using the ad-hoc node orderings of
previous methods. Our experiments show that the log-likelihood bound is
significantly tighter than the bound of previous schemes. Moreover, the models
fitted with the proposed algorithm can generate high-quality graphs that match
the structures of target graphs not seen during training. We have made our code
publicly available at
https://github.com/tufts-ml/graph-generation-vi.
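The core idea of the abstract — bounding an intractable log-likelihood that
marginalizes over node orderings with a variational lower bound — can be
illustrated on a toy problem. The joint probabilities and the uniform
variational distribution q below are invented for illustration; they are not
the paper's model:

```python
import math
from itertools import permutations

# Toy setting: a 3-node graph G, where each node ordering pi is one
# generation sequence. p_joint holds made-up values of p(G, pi); the exact
# likelihood p(G) is the sum over all orderings.
p_joint = dict(zip(permutations(range(3)),
                   [0.10, 0.05, 0.20, 0.05, 0.05, 0.05]))
log_p_exact = math.log(sum(p_joint.values()))  # log p(G)

# Variational bound: for any q(pi | G) over orderings,
#   log p(G) >= E_q[ log p(G, pi) - log q(pi | G) ]   (Jensen's inequality).
# Here q is uniform over the 3! orderings, as a simple baseline.
q = {pi: 1.0 / len(p_joint) for pi in p_joint}
elbo = sum(q[pi] * (math.log(p_joint[pi]) - math.log(q[pi])) for pi in q)

assert elbo <= log_p_exact  # the bound holds for any choice of q
```

The gap between `elbo` and `log_p_exact` shrinks as q approaches the true
posterior over orderings, which is why the paper's learned variational
distribution yields a tighter bound than fixed ad-hoc orderings.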
Archived Files and Locations
application/pdf, 11.1 MB — arxiv.org (repository), web.archive.org (webarchive) — arXiv:2106.06189v1