A Dropout-Tolerated Privacy-Preserving Method for Decentralized Crowdsourced Federated Learning

Tao Chen, Xiaofen Wang*, Hong Ning Dai, Haomiao Yang

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

1 Citation (Scopus)


Mobile crowdsourcing federated learning (FL-MCS) allows a requester to outsource its model-training tasks to workers who possess the desired data as well as strong computing power, thereby overcoming participants' limitations in both computing capability and data availability. However, FL-MCS still faces the risk of workers' data privacy leakage under diverse malicious attacks (e.g., gradient inference attacks). To address this problem, several privacy-preserving FL-MCS (PPFL-MCS) schemes have been proposed that aggregate local models at a central server. Unfortunately, these schemes are vulnerable to single points of failure and other malicious attacks on the central server. Meanwhile, workers may drop out of an online task due to erratic communication networks, causing the entire model aggregation to fail. To solve these issues, we propose a novel dropout-tolerated, privacy-preserving decentralized FL-MCS scheme, namely blockchain-based dropout-tolerated decentralized PPFL-MCS. Specifically, we define a novel cryptographic primitive, ID-based Aggregated Decryptable Broadcast Encryption (AD-IBBE), built on traditional ID-based broadcast encryption. In AD-IBBE, each sender's ciphertext can be decrypted only by the sender itself, while the aggregated ciphertext can be decrypted by all receivers in the broadcast group. We then design a homomorphic AD-IBBE algorithm that is formally proved semantically secure, and devise the decentralized PPFL-MCS scheme to guarantee the confidentiality of model gradients against internal and external adversaries. Moreover, we design a dropout-tolerated aggregation method to keep our decentralized PPFL-MCS scheme robust even if some workers lose connection. Extensive experimental results on different models and data sets demonstrate that the proposed scheme achieves model accuracy close to that of the non-dropout case. Even when some workers are offline, our scheme is more efficient than existing schemes in terms of dropout aggregation overhead.
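The abstract relies on a homomorphic property: individually encrypted gradients can be combined in ciphertext space so that only the aggregate, not any single worker's contribution, is ever decrypted. The paper's AD-IBBE construction is not reproduced here; the sketch below instead illustrates the same additive-homomorphic aggregation pattern with a textbook Paillier cryptosystem over toy-sized primes (all function names are illustrative, not the paper's API). Gradients are assumed to be quantized to small non-negative integers.

```python
import math
import random

def keygen(p=1789, q=1861):
    """Toy Paillier keypair; real deployments need primes of >=1024 bits each."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    # With generator g = n + 1, decryption reduces to multiplying by lam^-1 mod n.
    mu = pow(lam, -1, n)
    return (n,), (lam, mu, n)

def encrypt(pk, m):
    """Encrypt integer gradient m (0 <= m < n) under public key pk."""
    (n,) = pk
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(n + 1, m, n2) * pow(r, n, n2) % n2

def aggregate(pk, ciphertexts):
    """Multiplying Paillier ciphertexts adds the underlying plaintexts."""
    (n,) = pk
    n2 = n * n
    agg = 1
    for c in ciphertexts:
        agg = agg * c % n2
    return agg

def decrypt(sk, c):
    lam, mu, n = sk
    n2 = n * n
    return (pow(c, lam, n2) - 1) // n * mu % n

pk, sk = keygen()
gradients = [3, 5, 7]                      # quantized local gradients from three workers
cts = [encrypt(pk, g) for g in gradients]  # each worker encrypts its own gradient
total = decrypt(sk, aggregate(pk, cts))    # only the sum is revealed: 15
```

The key point mirrored from the abstract is that no individual ciphertext is decrypted during aggregation; dropout tolerance in the actual scheme additionally requires that the aggregate remain decryptable when only a subset of the expected ciphertexts arrives, which this minimal sketch does not model.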

Original language: English
Pages (from-to): 1788-1799
Number of pages: 12
Journal: IEEE Internet of Things Journal
Issue number: 2
Early online date: 15 Sept 2023
Publication status: Published - 15 Jan 2024

Scopus Subject Areas

  • Signal Processing
  • Information Systems
  • Hardware and Architecture
  • Computer Science Applications
  • Computer Networks and Communications

User-Defined Keywords

  • Decentralized
  • dropout tolerated
  • federated learning (FL)
  • mobile crowdsourcing
  • privacy preserving


