TY - JOUR
T1 - Build Yourself before Collaboration
T2 - Vertical Federated Learning with Limited Aligned Samples
AU - Shen, Wei
AU - Ye, Mang
AU - Yu, Wei
AU - Yuen, Pong C.
N1 - Funding Information:
This work is supported by National Natural Science Foundation of China (62361166629, 62176188, 62376200).
Publisher Copyright:
© 2025 IEEE.
PY - 2025/7
Y1 - 2025/7
N2 - Vertical Federated Learning (VFL) has emerged as a crucial privacy-preserving learning paradigm that trains models using distributed features from shared samples. However, the performance of VFL can be hindered when the number of shared or aligned samples is limited, a common issue in mobile environments where user data are diverse and unaligned across multiple devices. Existing approaches address this issue by generating features and estimating pseudo-labels for unaligned samples, which unavoidably introduces noise during the generation process. In this work, we propose Local Enhanced Effective Vertical Federated Learning (LEEF-VFL), which fully utilizes unaligned samples during local learning before collaboration. Unlike previous methods that overlook the private labels owned by each client, we leverage these private labels to learn from all local samples, constructing robust local models that serve as solid foundations for collaborative learning. Additionally, we reveal that the limited number of aligned samples introduces a distribution bias relative to the global data distribution. To address this, we propose minimizing the distribution discrepancy between the aligned samples and the global data distribution to enhance collaboration. Extensive experiments demonstrate the effectiveness of LEEF-VFL in addressing the challenges of limited aligned samples, making it suitable for VFL in mobile computing environments.
AB - Vertical Federated Learning (VFL) has emerged as a crucial privacy-preserving learning paradigm that trains models using distributed features from shared samples. However, the performance of VFL can be hindered when the number of shared or aligned samples is limited, a common issue in mobile environments where user data are diverse and unaligned across multiple devices. Existing approaches address this issue by generating features and estimating pseudo-labels for unaligned samples, which unavoidably introduces noise during the generation process. In this work, we propose Local Enhanced Effective Vertical Federated Learning (LEEF-VFL), which fully utilizes unaligned samples during local learning before collaboration. Unlike previous methods that overlook the private labels owned by each client, we leverage these private labels to learn from all local samples, constructing robust local models that serve as solid foundations for collaborative learning. Additionally, we reveal that the limited number of aligned samples introduces a distribution bias relative to the global data distribution. To address this, we propose minimizing the distribution discrepancy between the aligned samples and the global data distribution to enhance collaboration. Extensive experiments demonstrate the effectiveness of LEEF-VFL in addressing the challenges of limited aligned samples, making it suitable for VFL in mobile computing environments.
KW - Mobile applications
KW - security
KW - vertical federated learning
UR - http://www.scopus.com/inward/record.url?scp=85218765374&partnerID=8YFLogxK
U2 - 10.1109/TMC.2025.3543923
DO - 10.1109/TMC.2025.3543923
M3 - Journal article
AN - SCOPUS:85218765374
SN - 1536-1233
VL - 24
SP - 6503
EP - 6516
JO - IEEE Transactions on Mobile Computing
JF - IEEE Transactions on Mobile Computing
IS - 7
ER -