Learning Chebyshev Basis in Graph Convolutional Networks for Skeleton-based Action Recognition

by Hichem Sahbi

Released as an article.

2021  

Abstract

Spectral graph convolutional networks (GCNs) are particular deep models which aim at extending neural networks to arbitrary irregular domains. The principle of these networks consists in projecting graph signals using the eigen-decomposition of their Laplacians, performing filtering in the spectral domain, and then back-projecting the resulting filtered signals onto the input graph domain. However, the success of these operations is highly dependent on the relevance of the Laplacians used, which are mostly handcrafted, and this makes GCNs clearly sub-optimal. In this paper, we introduce a novel spectral GCN that learns not only the usual convolutional parameters but also the Laplacian operators. The latter are designed "end-to-end" as a part of a recursive Chebyshev decomposition with the particularity of conveying both the differential and the non-differential properties of the learned representations, with increasing order and discrimination power, without overparametrizing the trained GCNs. Extensive experiments, conducted on the challenging task of skeleton-based action recognition, show the generalization ability of our proposed Laplacian design and its outperformance of different baselines (built upon handcrafted and other learned Laplacians) as well as the related work.
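The abstract refers to the standard recursive Chebyshev decomposition, T_0(L)X = X, T_1(L)X = LX, T_k(L)X = 2L T_{k-1}(L)X - T_{k-2}(L)X, applied here with a Laplacian that is itself learned end-to-end rather than handcrafted. The sketch below is only an illustration of that idea in PyTorch; the class name LearnedChebConv, the symmetric parametrization of the learned operator, and all hyper-parameters are assumptions made for the example and do not reproduce the paper's exact design or constraints.

```python
# Illustrative sketch (not the authors' code): a Chebyshev graph convolution
# layer in which the (scaled) Laplacian is a trainable parameter, alongside
# the usual per-order filter weights. All names/values here are assumptions.
import torch
import torch.nn as nn

class LearnedChebConv(nn.Module):
    def __init__(self, in_channels, out_channels, num_nodes, order=3):
        super().__init__()
        self.order = order
        # Trainable Laplacian-like operator; in practice it could be
        # initialized from a handcrafted skeleton Laplacian.
        self.laplacian = nn.Parameter(torch.randn(num_nodes, num_nodes) * 0.01)
        # One weight matrix theta_k per Chebyshev order.
        self.theta = nn.Parameter(torch.randn(order, in_channels, out_channels) * 0.01)

    def forward(self, x):
        # x: (batch, num_nodes, in_channels) graph signal.
        # Symmetrize the learned operator (an illustrative choice).
        L = 0.5 * (self.laplacian + self.laplacian.transpose(0, 1))
        # Recursive Chebyshev basis: T_0 = X, T_1 = L X, T_k = 2 L T_{k-1} - T_{k-2}.
        T_prev = x
        T_curr = torch.einsum('ij,bjc->bic', L, x)
        out = T_prev @ self.theta[0]
        if self.order > 1:
            out = out + T_curr @ self.theta[1]
        for k in range(2, self.order):
            T_next = 2.0 * torch.einsum('ij,bjc->bic', L, T_curr) - T_prev
            out = out + T_next @ self.theta[k]
            T_prev, T_curr = T_curr, T_next
        return out  # (batch, num_nodes, out_channels)
```

As a usage example, a skeleton sequence with 25 joints and 3-channel inputs could be processed by `LearnedChebConv(3, 64, num_nodes=25, order=4)`; stacking such layers and training with cross-entropy would jointly optimize the filter weights and the learned operator, which is the general mechanism the abstract describes.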

Archived Files and Locations

application/pdf  767.2 kB
file_3zy3543io5ewxiivsxztvysksm
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2021-04-12
Version   v1
Language   en
arXiv  2104.05482v1
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints).
Catalog Record
Revision: fc4b9997-1b12-4839-abc0-0fc15b61e7c3