DTPP-DFL: A Dropout-Tolerated Privacy-Preserving Decentralized Federated Learning Framework

Tao Chen, Xiao Fen Wang*, Hong Ning Dai, Hao Miao Yang, Rang Zhou, Xiao Song Zhang

*Corresponding author for this work

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

Abstract

Federated Learning (FL) enables participants to collaboratively train a global model by sharing their gradients without uploading privacy-sensitive data. Although FL offers a degree of privacy preservation, local gradients transmitted in plaintext may still reveal private data under gradient-leakage attacks. To further protect local gradients, privacy-preserving FL schemes have been proposed. However, existing schemes that rely on a fully trusted central server are vulnerable to a single point of failure and malicious attacks. More robust privacy-preserving decentralized FL schemes built on multiple servers have recently been proposed, but they fail to aggregate local gradients when message transmission errors or data packet dropouts occur due to the instability of the communication network. To address these challenges, we propose a novel privacy-preserving decentralized FL scheme based on the blockchain and a modified identity-based homomorphic broadcast encryption algorithm. The scheme achieves both privacy protection and error/dropout tolerance. Security analysis shows that the proposed scheme protects the privacy of local gradients against both internal and external adversaries, and protects the privacy of global gradients against external adversaries. Moreover, it ensures the correctness of local gradient aggregation even when transmission errors or data packet dropouts occur. Extensive experiments demonstrate that the proposed scheme maintains model accuracy and achieves efficient performance.
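
The abstract's core mechanism, aggregating encrypted local gradients so that aggregators only ever see ciphertexts while still tolerating missing contributions, can be illustrated with a minimal sketch. The sketch below is not the paper's construction (which uses a modified identity-based homomorphic broadcast encryption over a blockchain); it substitutes textbook Paillier encryption, which is also additively homomorphic, with toy key sizes and hypothetical client names, purely to show how the ciphertexts from the clients that do report can be combined and decrypted to the sum of their plaintext gradients.

# Minimal sketch (assumptions labeled): textbook Paillier with toy primes stands in
# for the paper's modified identity-based homomorphic broadcast encryption.
import random
from math import gcd

def keygen(p=10007, q=10009):
    # Toy primes for illustration only; a real deployment needs large random primes.
    n = p * q
    n2 = n * n
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
    g = n + 1
    # mu = (L(g^lam mod n^2))^(-1) mod n, where L(x) = (x - 1) // n
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pk, m):
    n, g = pk
    n2 = n * n
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n) * mu % n

def add_ciphertexts(pk, c1, c2):
    # Additive homomorphism: the product of ciphertexts decrypts to the sum of plaintexts.
    n, _ = pk
    return (c1 * c2) % (n * n)

SCALE = 10_000  # gradients are quantized to integers before encryption

pk, sk = keygen()
n = pk[0]

# Hypothetical local gradients (one scalar per client, for brevity).
local_gradients = {"client_1": 0.12, "client_2": -0.05, "client_3": 0.33}
ciphertexts = {cid: encrypt(pk, int(round(v * SCALE)) % n)
               for cid, v in local_gradients.items()}

# Simulated dropout: client_3's ciphertext never arrives; aggregation still succeeds
# over whatever was received, mirroring the error/dropout tolerance the abstract claims.
received = {cid: c for cid, c in ciphertexts.items() if cid != "client_3"}

agg = None
for c in received.values():
    agg = c if agg is None else add_ciphertexts(pk, agg, c)

total = decrypt(pk, sk, agg)
total = total - n if total > n // 2 else total  # undo modular wrap for negative sums
print("aggregated gradient:", total / SCALE)    # 0.12 + (-0.05) = 0.07

In this sketch a single key holder decrypts the aggregate; the paper instead distributes trust across multiple servers and a blockchain, and its identity-based homomorphic broadcast encryption is what lets the surviving ciphertexts still be combined correctly when messages are corrupted or dropped. The pattern of summing only the received encrypted contributions is the same idea in simplified form.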

Original language: English
Title of host publication: GLOBECOM 2023 - 2023 IEEE Global Communications Conference
Publisher: IEEE
Pages: 2554-2559
Number of pages: 6
ISBN (Electronic): 9798350310900
ISBN (Print): 9798350310917
DOIs
Publication status: Published - Dec 2023
Event: 2023 IEEE Global Communications Conference, GLOBECOM 2023 - Kuala Lumpur, Malaysia
Duration: 4 Dec 2023 - 8 Dec 2023
https://globecom2023.ieee-globecom.org/
https://ieeexplore.ieee.org/xpl/conhome/10436708/proceeding

Publication series

Name: Proceedings - IEEE Global Communications Conference, GLOBECOM
ISSN (Print): 1930-529X
ISSN (Electronic): 2576-6813

Conference

Conference: 2023 IEEE Global Communications Conference, GLOBECOM 2023
Country/Territory: Malaysia
City: Kuala Lumpur
Period: 4/12/23 - 8/12/23

Scopus Subject Areas

  • Artificial Intelligence
  • Computer Networks and Communications
  • Hardware and Architecture
  • Signal Processing

User-Defined Keywords

  • Blockchain
  • Decentralized
  • Dropout-Tolerated
  • Federated Learning
  • Privacy-Preserving
