WAFFLE: Weighted Averaging for Personalized Federated Learning

by Martin Beaussart, Felix Grimberg, Mary-Anne Hartley, Martin Jaggi

Released as an article.

2021  

Abstract

In federated learning, model personalization can be a very effective strategy to deal with heterogeneous training data across clients. We introduce WAFFLE (Weighted Averaging For Federated LEarning), a personalized collaborative machine learning algorithm that leverages stochastic control variates for faster convergence. WAFFLE uses the Euclidean distance between clients' updates to weigh their individual contributions and thus minimize the personalized model loss on the specific agent of interest. Through a series of experiments, we compare our new approach to two recent personalized federated learning methods--Weight Erosion and APFL--as well as two general FL methods--Federated Averaging and SCAFFOLD. Performance is evaluated using two categories of non-identical client data distributions--concept shift and label skew--on two image data sets (MNIST and CIFAR10). Our experiments demonstrate the comparative effectiveness of WAFFLE, as it achieves or improves accuracy with faster convergence.
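The abstract describes the core mechanism only at a high level: each client's contribution is weighted according to the Euclidean distance between its update and the update of the client being personalized for. The sketch below illustrates that idea in Python; it is not the authors' reference implementation, and the function name, the softmax-over-negative-distances weighting rule, and the temperature parameter are assumptions made for illustration.

```python
# Minimal sketch (not the paper's reference implementation) of weighting client
# updates by their Euclidean distance to the target client's update, then
# forming a personalized update by weighted averaging.

import numpy as np

def personalized_weighted_average(updates, target_idx, temperature=1.0):
    """Combine flattened client updates into a personalized update.

    updates     : list of 1-D np.ndarray, one flattened model update per client
    target_idx  : index of the client the model is personalized for
    temperature : hypothetical scaling of distances before weighting
    """
    target = updates[target_idx]
    # Euclidean distance of every client's update to the target client's update
    dists = np.array([np.linalg.norm(u - target) for u in updates])
    # Closer updates get larger weights; a softmax over negative distances is
    # one simple choice (the paper may use a different weighting rule).
    weights = np.exp(-dists / temperature)
    weights /= weights.sum()
    # Weighted average of all client updates
    return sum(w * u for w, u in zip(weights, updates))

# Toy usage: three clients, each holding a 4-parameter update
rng = np.random.default_rng(0)
client_updates = [rng.normal(size=4) for _ in range(3)]
personalized = personalized_weighted_average(client_updates, target_idx=0)
print(personalized)
```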

Archived Files and Locations

application/pdf  987.2 kB
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2021-12-13
Version   v2
Language   en
arXiv  2110.06978v2