Allosteric Feature Collaboration for Model-Heterogeneous Federated Learning

Baoyao Yang*, Pong C. Yuen, Yiqun Zhang, An Zeng

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

Abstract

Although federated learning (FL) has achieved outstanding results in privacy-preserving distributed learning, the assumption of model homogeneity among clients restricts its wide application in practice. This article investigates a more general case, namely, model-heterogeneous FL (M-hete FL), where client models are independently designed and can be structurally heterogeneous. M-hete FL faces new challenges in collaborative learning because the parameters of heterogeneous models cannot be directly aggregated. In this article, we propose a novel allosteric feature collaboration (AlFeCo) method, which interchanges knowledge across clients and collaboratively updates heterogeneous models on the server. Specifically, an allosteric feature generator is developed to reveal task-relevant information from multiple client models. The revealed information is stored in client-shared and client-specific codes. We exchange client-specific codes across clients to facilitate knowledge interchange and generate allosteric features that are dimensionally variable for model updates. To promote information communication between different clients, a dual-path (model–model and model–prediction) communication mechanism is designed to supervise the collaborative model updates using the allosteric features. Client models thus communicate fully through knowledge interchange both between models and between models and predictions. We further provide theoretical evidence and convergence analysis to support the effectiveness of AlFeCo in M-hete FL. The experimental results show that the proposed AlFeCo method not only performs well on classical FL benchmarks but is also effective in model-heterogeneous federated antispoofing. Our codes are publicly available at https://github.com/ybaoyao/AlFeCo.

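The abstract describes the code-exchange idea only at a high level. The snippet below is a minimal, hypothetical Python sketch of that idea for orientation; it is not the authors' implementation (see the linked repository for the actual code), and all names, dimensions, and the random-projection encoder/decoder are illustrative assumptions.

```python
# Conceptual sketch (NOT the authors' AlFeCo code): exchange client-specific
# codes across clients and generate per-client "allosteric" features whose
# dimensionality matches each heterogeneous client model.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: three clients with structurally different models,
# hence different feature dimensions.
client_feature_dims = {"client_a": 64, "client_b": 128, "client_c": 32}
code_dim = 16  # assumed dimensionality of the shared/specific codes


def encode(features, code_dim):
    """Toy encoder: split a client's feature vector into a client-shared code
    and a client-specific code via a fixed random projection (a stand-in for
    a learned generator's encoder)."""
    proj = rng.standard_normal((features.shape[0], 2 * code_dim))
    codes = features @ proj
    return codes[:code_dim], codes[code_dim:]  # (shared, specific)


def generate_allosteric_feature(shared_code, specific_code, out_dim):
    """Toy decoder: map the concatenated codes to a feature of the target
    client's dimensionality (dimensionally variable output)."""
    z = np.concatenate([shared_code, specific_code])
    proj = rng.standard_normal((z.shape[0], out_dim))
    return np.tanh(z @ proj)


# Each client uploads a representative feature vector of its own dimension.
uploaded = {name: rng.standard_normal(dim) for name, dim in client_feature_dims.items()}

# Server side: encode every client's features into shared/specific codes.
codes = {name: encode(feat, code_dim) for name, feat in uploaded.items()}

# Exchange client-specific codes across clients (here: a simple rotation),
# then generate allosteric features sized for each receiving client model.
names = list(codes)
for i, receiver in enumerate(names):
    donor = names[(i + 1) % len(names)]  # receive another client's specific code
    shared, _ = codes[receiver]
    _, donor_specific = codes[donor]
    allo = generate_allosteric_feature(shared, donor_specific,
                                       client_feature_dims[receiver])
    print(receiver, "<- specific code from", donor, "| feature dim:", allo.shape[0])
```

In the paper, the encoder/decoder roles above are played by the learned allosteric feature generator, and the generated features drive the dual-path (model–model and model–prediction) supervision on the server; the sketch only illustrates the dimensional flexibility of the exchanged representations.
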
Original language: English
Pages (from-to): 1-15
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
DOIs
Publication status: E-pub ahead of print - 25 Dec 2023

Scopus Subject Areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

User-Defined Keywords

  • Adaptation models
  • Collaboration
  • Collaborative learning
  • Computational modeling
  • Data models
  • feature generation
  • Federated learning (FL)
  • model heterogeneity
  • Predictive models
  • Servers
