Multi-Level Graph Contrastive Learning

by Pengpeng Shao, Tong Liu, Dawei Zhang, Jianhua Tao, Feihu Che, Guohua Yang

Released as an article.

2021  

Abstract

Graph representation learning has attracted a surge of interest recently; it aims to learn a discriminative embedding for each node in a graph. Most existing representation methods focus on supervised learning and depend heavily on label information. However, annotated graphs are expensive to obtain in the real world, especially in specialized domains (e.g., biology), since labeling a graph requires domain knowledge from the annotator. To address this problem, self-supervised learning provides a feasible solution for graph representation learning. In this paper, we propose a Multi-Level Graph Contrastive Learning (MLGCL) framework for learning robust representations of graph data by contrasting space views of graphs. Specifically, we introduce a novel pair of contrastive views: the topological view and the feature-space view. The original graph is a first-order approximation structure that may contain uncertainty or errors, while the kNN graph generated from encoded features preserves high-order proximity. Thus the kNN graph not only provides a complementary view but is also better suited to the GNN encoder for extracting discriminative representations. Furthermore, we develop a multi-level contrastive mode to preserve the local similarity and the semantic similarity of graph-structured data simultaneously. Extensive experiments indicate that MLGCL achieves promising results compared with existing state-of-the-art graph representation learning methods on seven datasets.
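The abstract's core recipe, contrasting two views of the same graph (the original topology and a kNN graph built from encoded features) at more than one granularity, can be sketched compactly. The following is a minimal illustration, not the authors' released code: knn_graph and info_nce are hypothetical helper names, PyTorch is assumed, and the random tensors stand in for the node embeddings a shared GNN encoder would produce on each view.

import torch
import torch.nn.functional as F

def knn_graph(z, k=5):
    # Symmetric kNN adjacency from encoded node features (feature-space view).
    z = F.normalize(z, dim=1)
    sim = z @ z.t()                       # pairwise cosine similarity
    sim.fill_diagonal_(float('-inf'))     # a node is not its own neighbour
    idx = sim.topk(k, dim=1).indices      # k nearest neighbours per node
    adj = torch.zeros_like(sim)
    adj.scatter_(1, idx, 1.0)
    return ((adj + adj.t()) > 0).float()  # symmetrize the adjacency

def info_nce(a, b, tau=0.5):
    # InfoNCE: row i of `a` and row i of `b` form a positive pair;
    # every other row of `b` serves as a negative.
    a, b = F.normalize(a, dim=1), F.normalize(b, dim=1)
    logits = a @ b.t() / tau
    return F.cross_entropy(logits, torch.arange(a.size(0)))

# Stand-ins for embeddings of 100 nodes from the two views.
z_topo = torch.randn(100, 32)   # encoder output on the original graph
z_feat = torch.randn(100, 32)   # encoder output on the kNN-graph view

adj_knn   = knn_graph(z_feat, k=5)      # structure of the feature-space view
node_loss = info_nce(z_topo, z_feat)    # local (node-level) contrast
# A graph-level term contrasting pooled readouts of the two views would be
# added analogously to realise the "multi-level" objective.
print(adj_knn.shape, node_loss.item())

Building the kNN graph from encoder outputs rather than raw features is what lets the feature-space view capture higher-order proximity, which is the complementarity the abstract argues for.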

Archived Files and Locations

application/pdf  483.9 kB
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2021-07-06
Version   v1
Language   en
arXiv  2107.02639v1