TY - JOUR
T1 - Federated Domain-Independent Prototype Learning With Alignments of Representation and Parameter Spaces for Feature Shift
AU - Fu, Lele
AU - Huang, Sheng
AU - Lai, Yanyi
AU - Zhang, Chuanfu
AU - Dai, Hong Ning
AU - Zheng, Zibin
AU - Chen, Chuan
N1 - The work was supported in part by the National Key Research and Development Program of China under Grant 2023YFB2703700, in part by the National Natural Science Foundation of China under Grant 62176269, and in part by the Guangzhou Science and Technology Program under Grant 2023A04J0314.
PY - 2025/9
Y1 - 2025/9
N2 - Federated learning provides a privacy-preserving modeling schema for distributed data, coordinating multiple clients to collaboratively train a global model. However, data stored in different clients may be collected from diverse domains, and the resulting feature shift tends to degrade the performance of the global model. In this paper, we propose a Federated Domain-Independent Prototype Learning (FedDP) method with Alignments of Representation and Parameter Spaces for Feature Shift. Concretely, FedDP aims to eliminate domain-specific information and extract pure representations via the information bottleneck, integrating them into local and global domain-independent prototypes, respectively. To align the cross-domain representation spaces, the global domain-independent prototypes serve as supervisory signals that pull local intra-class representations toward them. Further, to mitigate the divergence of optimization directions across clients induced by the feature shift, global representations are produced by the global model on the client side and guide the learning of local representations, thus unifying the parameter spaces of the local models. We derive a theoretical lower bound of the optimization objective based on mutual information, which is transformed into a computable loss. The proposed FedDP is applicable to both homogeneous and heterogeneous model scenarios. Extensive experiments on three challenging multi-domain datasets demonstrate the superiority of FedDP over state-of-the-art federated learning methods.
AB - Federated learning provides a privacy-preserving modeling schema for distributed data, coordinating multiple clients to collaboratively train a global model. However, data stored in different clients may be collected from diverse domains, and the resulting feature shift tends to degrade the performance of the global model. In this paper, we propose a Federated Domain-Independent Prototype Learning (FedDP) method with Alignments of Representation and Parameter Spaces for Feature Shift. Concretely, FedDP aims to eliminate domain-specific information and extract pure representations via the information bottleneck, integrating them into local and global domain-independent prototypes, respectively. To align the cross-domain representation spaces, the global domain-independent prototypes serve as supervisory signals that pull local intra-class representations toward them. Further, to mitigate the divergence of optimization directions across clients induced by the feature shift, global representations are produced by the global model on the client side and guide the learning of local representations, thus unifying the parameter spaces of the local models. We derive a theoretical lower bound of the optimization objective based on mutual information, which is transformed into a computable loss. The proposed FedDP is applicable to both homogeneous and heterogeneous model scenarios. Extensive experiments on three challenging multi-domain datasets demonstrate the superiority of FedDP over state-of-the-art federated learning methods.
KW - Feature shift
KW - federated learning
KW - information bottleneck
KW - prototype learning
UR - https://www.scopus.com/pages/publications/105002743123
U2 - 10.1109/TMC.2025.3560083
DO - 10.1109/TMC.2025.3560083
M3 - Journal article
AN - SCOPUS:105002743123
SN - 1536-1233
VL - 24
SP - 9004
EP - 9019
JO - IEEE Transactions on Mobile Computing
JF - IEEE Transactions on Mobile Computing
IS - 9
ER -