Order Matters: Probabilistic Modeling of Node Sequence for Graph Generation

by Xiaohui Chen, Xu Han, Jiajing Hu, Francisco J. R. Ruiz, Liping Liu

Released as an article.

2021  

Abstract

A graph generative model defines a distribution over graphs. One type of generative model is constructed by autoregressive neural networks, which sequentially add nodes and edges to generate a graph. However, the likelihood of a graph under the autoregressive model is intractable, as there are numerous sequences leading to the given graph; this makes maximum likelihood estimation challenging. Instead, in this work we derive the exact joint probability over the graph and the node ordering of the sequential process. From the joint, we approximately marginalize out the node orderings and compute a lower bound on the log-likelihood using variational inference. We train graph generative models by maximizing this bound, without using the ad-hoc node orderings of previous methods. Our experiments show that the log-likelihood bound is significantly tighter than the bound of previous schemes. Moreover, the models fitted with the proposed algorithm can generate high-quality graphs that match the structures of target graphs not seen during training. We have made our code publicly available at https://github.com/tufts-ml/graph-generation-vi.
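As a quick sketch of the objective described above (our own notation, not taken from the paper: \sigma denotes a node ordering of the sequential process and q(\sigma \mid G) a variational distribution over orderings), the bound has the standard evidence-lower-bound form:

    \log p(G) \;=\; \log \sum_{\sigma} p(G, \sigma) \;\geq\; \mathbb{E}_{q(\sigma \mid G)}\!\left[ \log p(G, \sigma) - \log q(\sigma \mid G) \right]

By Jensen's inequality the right-hand side lower-bounds \log p(G), and the bound becomes tight when q(\sigma \mid G) matches the true posterior over orderings; maximizing it jointly over the generative model and q is what "maximizing this bound" refers to.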

Archived Content

There are no accessible files associated with this release ("dark" preservation only). You could check other releases of this work for an accessible version.

Type: article
Stage: submitted
Date: 2021-06-14
Version: v2
Language: en
arXiv: 2106.06189v2