Guided-GAN: Adversarial Representation Learning for Activity Recognition with Wearables

by Alireza Abedin, Hamid Rezatofighi, Damith C. Ranasinghe

Released as an article.

2021  

Abstract

Human activity recognition (HAR) is an important research field in ubiquitous computing, where the acquisition of large-scale labeled sensor data is tedious, labor-intensive, and time-consuming. State-of-the-art unsupervised remedies investigated to alleviate the burden of data annotation in HAR mainly explore training autoencoder frameworks. In this paper, we explore generative adversarial network (GAN) paradigms to learn unsupervised feature representations from wearable sensor data, and design a new GAN framework, Geometrically-Guided GAN or Guided-GAN, for the task. To demonstrate the effectiveness of our formulation, we evaluate the features learned by Guided-GAN in an unsupervised manner on three downstream classification benchmarks. Our results demonstrate that Guided-GAN outperforms existing unsupervised approaches whilst closely approaching the performance of fully supervised learned representations. The proposed approach paves the way to bridging the gap between unsupervised and supervised human activity recognition, whilst helping to reduce the cost of human data annotation.
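The evaluation implied by the abstract is the standard linear-probe protocol: freeze the unsupervisedly learned encoder and train only a lightweight classifier on its features for each downstream benchmark. A minimal sketch of that protocol is below; the random-projection "encoder", toy two-class sensor data, and all function names are illustrative stand-ins and are not the authors' Guided-GAN architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(x, W):
    # Frozen nonlinear projection standing in for a trained,
    # unsupervised feature extractor (e.g. a GAN discriminator backbone).
    return np.maximum(x @ W, 0.0)  # ReLU

def train_linear_probe(feats, labels, n_classes, lr=0.05, epochs=300):
    # Multinomial logistic regression on frozen features:
    # only this linear head sees the downstream labels.
    n, d = feats.shape
    Wc = np.zeros((d, n_classes))
    y = np.eye(n_classes)[labels]  # one-hot targets
    for _ in range(epochs):
        logits = feats @ Wc
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        Wc -= lr * feats.T @ (p - y) / n  # softmax cross-entropy gradient
    return Wc

# Toy "sensor windows": two well-separated activity classes in 16-D.
n_per = 100
x = np.vstack([rng.normal(-1.0, 1.0, size=(n_per, 16)),
               rng.normal(+1.0, 1.0, size=(n_per, 16))])
labels = np.array([0] * n_per + [1] * n_per)

W = rng.normal(size=(16, 32))  # frozen "unsupervised" encoder weights
feats = extract_features(x, W)
Wc = train_linear_probe(feats, labels, n_classes=2)
acc = (np.argmax(feats @ Wc, axis=1) == labels).mean()
print(f"linear-probe accuracy: {acc:.2f}")
```

The point of the protocol is that downstream accuracy measures only the quality of the frozen representation, since the probe itself has no capacity to learn new features.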

Archived Files and Locations

application/pdf  1.6 MB
file_3c6bp5kk4zbz3ceegzws76u5la
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2021-10-12
Version   v1
Language   en
arXiv  2110.05732v1
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints).
Catalog Record
Revision: 72691563-3d72-409e-bb61-30c17ea13986