AWB-GCN: A Graph Convolutional Network Accelerator with Runtime Workload Rebalancing

by Tong Geng, Ang Li, Tianqi Wang, Chunshu Wu, Yanfei Li, Runbin Shi, Antonino Tumeo, Shuai Che, Steve Reinhardt, Martin Herbordt

Released as an article.

2020  

Abstract

Deep learning systems have been applied mostly to Euclidean data such as images, video, and audio. In many applications, however, entities and their relationships are better expressed with graphs. Graph Convolutional Networks (GCNs) appear to be a promising approach to efficiently learn from graph data structures, having shown advantages in many critical applications. As with other deep learning modalities, hardware acceleration is critical. The challenge is that real-world graphs are often extremely large and unbalanced; this poses significant performance demands and design challenges. In this paper, we propose Autotuning-Workload-Balancing GCN (AWB-GCN) to accelerate GCN inference. To address the issue of workload imbalance in processing real-world graphs, three hardware-based autotuning techniques are proposed: dynamic distribution smoothing, remote switching, and row remapping. In particular, AWB-GCN continuously monitors the sparse graph pattern, dynamically adjusts the workload distribution among a large number of processing elements (up to 4K PEs), and, after converging, reuses the ideal configuration. Evaluations are performed using an Intel D5005 FPGA with five commonly-used datasets. Results show that 4K-PE AWB-GCN can significantly elevate the average PE utilization (from 32.5% to 88.6%) and demonstrate considerable performance speedups over CPUs (7569x), GPUs (80.3x), and a prior GCN accelerator (7.4x).
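To ground the utilization figures in the abstract: a GCN layer computes H^(l+1) = sigma(A H^(l) W^(l)), where the sparse adjacency matrix A of a real-world graph typically has a power-law nonzero distribution, so a static row-to-PE mapping leaves most PEs idle while a few grind through hub rows. The sketch below is a hypothetical software illustration of that effect and of a simple greedy rebalancing; it is not the paper's hardware autotuning (dynamic distribution smoothing, remote switching, row remapping), and all names and parameters in it are illustrative assumptions.

```python
# Hypothetical sketch (not the AWB-GCN implementation): shows why
# power-law graphs unbalance per-PE work in a GCN layer, and how
# rebalancing the row-to-PE mapping evens the load out.
import numpy as np

rng = np.random.default_rng(0)
num_nodes, num_pes = 1024, 8

# Power-law-ish per-row nonzero counts: a few hub rows dominate the
# work, mirroring the skewed real-world graphs the abstract describes.
nnz_per_row = np.minimum(rng.zipf(2.0, num_nodes), num_nodes)

# Static mapping: contiguous row blocks per PE (the baseline whose
# utilization collapses on skewed graphs).
static_load = nnz_per_row.reshape(num_pes, -1).sum(axis=1)

# Greedy rebalancing: assign each row (heaviest first) to the currently
# least-loaded PE -- a software stand-in for runtime workload rebalancing.
balanced_load = np.zeros(num_pes)
for w in np.sort(nnz_per_row)[::-1]:
    balanced_load[balanced_load.argmin()] += w

def utilization(load):
    # Mean PE load / max PE load: the most-loaded PE sets the latency.
    return load.mean() / load.max()

print(f"static   utilization: {utilization(static_load):.2f}")
print(f"balanced utilization: {utilization(balanced_load):.2f}")
```

On skewed inputs the static mapping's utilization drops sharply while the greedy assignment stays near 1.0; this is the kind of gap (32.5% vs. 88.6% average PE utilization) that AWB-GCN closes in hardware, at runtime, by monitoring the sparse pattern and remapping work on the fly.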

Archived Files and Locations

application/pdf  7.8 MB
arxiv.org (repository)
web.archive.org (webarchive)
Type: article
Stage: submitted
Date: 2020-08-14
Version: v7
Language: en
arXiv: 1908.10834v7
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: e21383df-b914-48a2-b23f-a6d845aa28ee