WAFFLE: Weighted Averaging for Personalized Federated Learning
by
Martin Beaussart, Felix Grimberg, Mary-Anne Hartley, Martin Jaggi
2021
Abstract
In federated learning, model personalization can be a very effective strategy
to deal with heterogeneous training data across clients. We introduce WAFFLE
(Weighted Averaging For Federated LEarning), a personalized collaborative
machine learning algorithm that leverages stochastic control variates for
faster convergence. WAFFLE uses the Euclidean distance between clients' updates
to weigh their individual contributions and thus minimize the personalized
model loss on the specific agent of interest. Through a series of experiments,
we compare our new approach to two recent personalized federated learning
methods (Weight Erosion and APFL) as well as two general FL methods (Federated
Averaging and SCAFFOLD). Performance is evaluated using two categories of
non-identical client data distributions, concept shift and label skew, on two
image datasets (MNIST and CIFAR-10). Our experiments demonstrate the
comparative effectiveness of WAFFLE, which matches or improves accuracy while
converging faster.
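The abstract describes weighting each client's contribution by the Euclidean distance between client updates, so that clients whose updates are closer to the target client's update contribute more to its personalized model. The exact WAFFLE weighting rule is defined in the paper itself; the sketch below is only a hypothetical illustration of distance-based weighted averaging, with an inverse-distance weighting chosen for simplicity (the function names and the `eps` smoothing term are assumptions, not from the source).

```python
import numpy as np

def distance_based_weights(updates, target_idx, eps=1e-8):
    """Illustrative weighting: clients whose updates lie closer (in
    Euclidean distance) to the target client's update receive larger
    weights. Hypothetical sketch, not the exact WAFFLE formula."""
    target = updates[target_idx]
    dists = np.array([np.linalg.norm(u - target) for u in updates])
    inv = 1.0 / (dists + eps)        # smaller distance -> larger weight
    return inv / inv.sum()           # normalize so weights sum to 1

def personalized_average(updates, target_idx):
    """Weighted average of all client updates, personalized for one client."""
    w = distance_based_weights(updates, target_idx)
    return sum(wi * u for wi, u in zip(w, updates))
```

Under this toy rule, the target client's own update always receives the largest weight, and an outlier client (e.g. one affected by concept shift) is down-weighted in proportion to how far its update drifts from the target's.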
Archived Files and Locations
application/pdf, 987.2 kB
arxiv.org (repository); web.archive.org (webarchive)
arXiv: 2110.06978v2