TY - JOUR
T1 - Federated Domain-Independent Prototype Learning With Alignments of Representation and Parameter Spaces for Feature Shift
AU - Fu, Lele
AU - Huang, Sheng
AU - Lai, Yanyi
AU - Zhang, Chuanfu
AU - Dai, Hong Ning
AU - Zheng, Zibin
AU - Chen, Chuan
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025/4/11
Y1 - 2025/4/11
N2 - Federated learning provides a privacy-preserving modeling schema for distributed data, coordinating multiple clients to collaboratively train a global model. However, data stored in different clients may be collected from diverse domains, and the resulting feature shift tends to degrade the performance of the global model. In this paper, we propose a Federated Domain-Independent Prototype Learning (FedDP) method with Alignments of Representation and Parameter Spaces for Feature Shift. Concretely, FedDP aims to eliminate domain-specific information and extract pure representations via the information bottleneck, from which the local and global domain-independent prototypes are integrated. To align the cross-domain representation spaces, the global domain-independent prototypes serve as supervisory signals that pull local intra-class representations toward them. Further, to mitigate the divergence of optimization directions across clients induced by feature shift, global representations are produced by the global model on the client side and guide the learning of local representations, thereby unifying the parameter spaces of the local models. We derive a theoretical lower bound of the mutual-information-based optimization objective and transform it into a computable loss. The proposed FedDP applies to both homogeneous and heterogeneous model scenarios. Extensive experiments on three challenging multi-domain datasets demonstrate the superiority of FedDP over state-of-the-art federated learning methods.
AB - Federated learning provides a privacy-preserving modeling schema for distributed data, coordinating multiple clients to collaboratively train a global model. However, data stored in different clients may be collected from diverse domains, and the resulting feature shift tends to degrade the performance of the global model. In this paper, we propose a Federated Domain-Independent Prototype Learning (FedDP) method with Alignments of Representation and Parameter Spaces for Feature Shift. Concretely, FedDP aims to eliminate domain-specific information and extract pure representations via the information bottleneck, from which the local and global domain-independent prototypes are integrated. To align the cross-domain representation spaces, the global domain-independent prototypes serve as supervisory signals that pull local intra-class representations toward them. Further, to mitigate the divergence of optimization directions across clients induced by feature shift, global representations are produced by the global model on the client side and guide the learning of local representations, thereby unifying the parameter spaces of the local models. We derive a theoretical lower bound of the mutual-information-based optimization objective and transform it into a computable loss. The proposed FedDP applies to both homogeneous and heterogeneous model scenarios. Extensive experiments on three challenging multi-domain datasets demonstrate the superiority of FedDP over state-of-the-art federated learning methods.
KW - Feature shift
KW - federated learning
KW - information bottleneck
KW - prototype learning
UR - http://www.scopus.com/inward/record.url?scp=105002743123&partnerID=8YFLogxK
U2 - 10.1109/TMC.2025.3560083
DO - 10.1109/TMC.2025.3560083
M3 - Journal article
AN - SCOPUS:105002743123
SN - 1536-1233
JO - IEEE Transactions on Mobile Computing
JF - IEEE Transactions on Mobile Computing
ER -