On-Device Machine Learning: An Algorithms and Learning Theory Perspective
by
Sauptik Dhar, Junyao Guo, Jiayi Liu, Samarth Tripathi, Unmesh Kurup, Mohak Shah
2020
Abstract
The predominant paradigm for using machine learning models on a device is to
train a model in the cloud and perform inference using the trained model on the
device. However, with the increasing number of smart devices and improvements
in hardware, there is growing interest in performing model training on the
device itself. Given this surge
in interest, a comprehensive survey of the field from a device-agnostic
perspective sets the stage for both understanding the state-of-the-art and for
identifying open challenges and future avenues of research. However, on-device
learning is an expansive field with connections to a large number of related
topics in AI and machine learning (including online learning, model adaptation,
one/few-shot learning, etc.). Hence, covering such a large number of topics in
a single survey is impractical. This survey finds a middle ground by
reformulating the problem of on-device learning as resource-constrained
learning, where the resources are compute and memory. This reformulation allows
tools, techniques, and algorithms from a wide variety of research areas to be
compared equitably. In addition to summarizing the state-of-the-art, the survey
also identifies a number of challenges and next steps for both the algorithmic
and theoretical aspects of on-device learning.
Archived Files and Locations
application/pdf, 1.4 MB
arxiv.org (repository); web.archive.org (webarchive)
arXiv:1911.00623v2