Abstract
In federated learning (FL), model performance typically suffers from client drift induced by data heterogeneity, and mainstream work focuses on correcting this drift. We propose a different approach, named virtual homogeneity learning (VHL), that directly “rectifies” the data heterogeneity. In particular, VHL conducts FL with a virtual homogeneous dataset crafted to satisfy two conditions: it contains no private information and it is separable. The virtual dataset can be generated from pure noise shared across clients and is used to calibrate the features learned from the heterogeneous client data. Theoretically, we prove that VHL enjoys generalization guarantees on the natural distribution. Empirically, we demonstrate that VHL endows FL with drastically improved convergence speed and generalization performance. VHL is the first attempt to use a virtual dataset to address data heterogeneity, offering a new and effective means of improving FL.
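The abstract only sketches the mechanism, so the snippet below is a minimal, hypothetical illustration of the idea rather than the authors' implementation. It assumes a plain PyTorch classifier; the helper names `make_virtual_dataset` and `local_step`, the per-class Gaussian-noise construction of the virtual data, the weight `lam`, and the cross-entropy calibration term are all illustrative assumptions (the paper calibrates features with its own objective, which may differ).

```python
# Hypothetical sketch of the VHL idea, not the authors' code.
import torch
import torch.nn.functional as F


def make_virtual_dataset(num_classes, per_class, dim, seed=0):
    """Generate a separable 'virtual' dataset from pure noise.
    Every client uses the same seed, so the dataset is identical
    (homogeneous) across clients and contains no private information."""
    g = torch.Generator().manual_seed(seed)
    xs, ys = [], []
    for c in range(num_classes):
        center = torch.randn(dim, generator=g)  # class-specific noise anchor
        xs.append(center + 0.1 * torch.randn(per_class, dim, generator=g))
        ys.append(torch.full((per_class,), c, dtype=torch.long))
    return torch.cat(xs), torch.cat(ys)


def local_step(model, natural_batch, virtual_batch, optimizer, lam=1.0):
    """One client update: task loss on the natural (private) data plus a
    calibration term on the shared virtual data. VHL calibrates intermediate
    features; the cross-entropy used here is only an illustrative stand-in
    for that objective."""
    (x_n, y_n), (x_v, y_v) = natural_batch, virtual_batch
    task_loss = F.cross_entropy(model(x_n), y_n)
    calib_loss = F.cross_entropy(model(x_v), y_v)
    loss = task_loss + lam * calib_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because every client derives the same virtual data from a shared seed, server-side aggregation can proceed as in standard FL without any additional communication of private data.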
Original language | English |
---|---|
Title of host publication | Proceedings of the 39th International Conference on Machine Learning (ICML 2022) |
Editors | Kamalika Chaudhuri, Stefanie Jegelka, Le Song, Csaba Szepesvari, Gang Niu, Sivan Sabato |
Publisher | ML Research Press |
Pages | 21111-21132 |
Number of pages | 22 |
Publication status | Published - 17 Jul 2022 |
Event | 39th International Conference on Machine Learning, ICML 2022 - Baltimore Convention Center, Baltimore, Maryland, United States |
Duration | 17 Jul 2022 → 23 Jul 2022 |
Publication series
Name | Proceedings of Machine Learning Research |
---|---|
Volume | 162 |
ISSN (Print) | 2640-3498 |
Conference
Conference | 39th International Conference on Machine Learning, ICML 2022 |
---|---|
Country/Territory | United States |
City | Baltimore, Maryland |
Period | 17/07/22 → 23/07/22 |
Internet address | https://icml.cc/Conferences/2022 |