Mobility Inference on Long-Tailed Sparse Trajectory

by Lei Shi, Yuankai Luo, Shuai Ma, Hanghang Tong, Zhetao Li, Xiatian Zhang, Zhiguang Shan

Published in ACM Transactions on Intelligent Systems and Technology by Association for Computing Machinery (ACM).

2022  

Abstract

Analyzing urban trajectory data has become an important topic in data mining. How can we model human mobility, consisting of stay and travel states, from raw trajectory data? How can we infer these mobility states from a single user's trajectory information? How can we further generalize the mobility inference to real-world trajectory data that span multiple users and are sparsely sampled over time? In this paper, based on formal, rigorous definitions of stay/travel mobility, we propose a single-trajectory inference algorithm that exploits a generic long-tailed sparsity pattern in large-scale trajectory data. The algorithm guarantees 100% precision in stay/travel inference, with a provable lower bound on recall. Furthermore, we design a transformer-like deep learning architecture for the problem of mobility inference from multiple sparse trajectories. Several adaptations from the standard transformer network structure are introduced, including a singleton design that avoids the negative effect of sparse labels on the decoder side, a customized space-time embedding of location-record features, and a mask apparatus at the output side for loss function correction. Evaluations on three trajectory datasets of 40 million urban users validate the performance guarantees of the proposed inference algorithm and demonstrate the superiority of our deep learning model over sequence learning methods in the literature. On extremely sparse trajectories, the deep learning model improves on the single-trajectory inference algorithm by more than two times in overall and F1 accuracy. The model also generalizes to large-scale trajectory data from different sources with good scalability.
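The abstract mentions a mask apparatus at the output side that corrects the loss function when labels are sparse. Below is a minimal, illustrative sketch (not the authors' code) of that general idea in Python/PyTorch: only time steps that actually carry a stay/travel label contribute to the cross-entropy. All names (masked_stay_travel_loss, logits, labels, mask) and shapes are assumptions for illustration only.

```python
import torch
import torch.nn.functional as F

def masked_stay_travel_loss(logits: torch.Tensor,
                            labels: torch.Tensor,
                            mask: torch.Tensor) -> torch.Tensor:
    """Cross-entropy over stay/travel classes, averaged only over labeled steps.

    logits: (batch, seq_len, 2) raw scores for the two mobility states
    labels: (batch, seq_len) 0 = stay, 1 = travel (values at masked steps are ignored)
    mask:   (batch, seq_len) 1.0 where a label exists, 0.0 at unlabeled steps
    """
    per_step = F.cross_entropy(
        logits.reshape(-1, logits.size(-1)),
        labels.reshape(-1),
        reduction="none",
    ).reshape(labels.shape)
    # Zero out the loss at unlabeled (sparse) positions, then renormalize
    # by the number of observed steps so sparsity does not bias the loss.
    return (per_step * mask).sum() / mask.sum().clamp(min=1.0)

if __name__ == "__main__":
    logits = torch.randn(4, 16, 2)            # toy batch of 4 trajectories
    labels = torch.randint(0, 2, (4, 16))     # hypothetical stay/travel labels
    mask = (torch.rand(4, 16) < 0.3).float()  # ~30% of steps carry a label
    print(masked_stay_travel_loss(logits, labels, mask))
```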

Archived Files and Locations

application/pdf  1.6 MB
file_buznhksucfgr7i2cdxps77pgum
dl.acm.org (publisher)
web.archive.org (webarchive)
Preserved and Accessible
Type: article-journal
Stage: published
Date: 2022-09-12
Language: en
Container Metadata
Not in DOAJ
In Keepers Registry
ISSN-L:  2157-6904
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: 58d5154e-5edc-4bc6-8a97-38efec8e0311